Monday, August 29, 2011
Integrate, Learn The Language
Stroll around Carmarthen market and it quickly becomes clear how important the Welsh language is in this historic town. A young woman is discussing the price of lamb with a butcher in animated Welsh while a teenager in a hoodie is in the bookshop choosing a greetings card in the same language. At the cafe's busy tables there is little evidence of any English at all being spoken.
But language campaigners and many Welsh-speaking residents are warning such vibrancy could be lost if proposals to build thousands of new homes in Carmarthenshire go ahead.
The county council's local development plan makes the case that projected increases in population mean more than 11,000 new homes are needed in this corner of south-west Wales, including 1,200 on the edge of Carmarthen.
The fear among many proud locals is that the majority of the people moving in will not speak Welsh, a change that would pose a "huge threat" to the language.
Town councillor and sheriff Alun Lenny said the language was a fundamental part of Carmarthen life. "Thousands of people live their lives through the medium of Welsh," he said. "It's part of our being. People use Welsh when they shop, when they worship, when they socialise. Much of civic life is carried out in Welsh. It's not a superficial, quirky element."
Lenny says the plans could increase the size of the town, which has about 15,000 residents, by a fifth. "Since the 60s, there has been a constant battle to maintain the language as part of the fabric of present life and society," he said. "These plans threaten to throw all that out of kilter if many hundreds of people who don't speak Welsh – and don't wish to – suddenly move in."
Carmarthen has a wonderfully rich history. It claims to be the oldest town in Wales and the birthplace of Merlin (Myrddin in Welsh), the legendary Welsh prophet and wizard. The Romans and Normans built fortifications here and the Black Book of Carmarthen, a collection of poetry, is one of the earliest surviving manuscripts written solely in Welsh. Just over 50% of the county's population speak Welsh.
In the 1980s there was an influx of people looking to break away from English and Welsh cities, but they tended to be younger people with families who, if anything, gave the language a boost because they put children into the local schools, where they learnt Welsh.
The fear is that the next wave of new arrivals will be older people, retiring to Carmarthenshire, attracted by the relatively low property prices and the proximity of lovely countryside and beaches. But, it is felt, many of them will not bother to learn the language.
Sioned Elin, the Carmarthenshire chair of the Welsh language campaign group Cymdeithas yr Iaith Gymraeg, called for the south-west Wales development plan to be scrapped, claiming no thorough assessments had been made of the impact of the housing developments on the language. "Such assessments would have almost certainly shown a huge threat," she said.
Cymdeithas yr Iaith Gymraeg says what is happening in Carmarthenshire is just one example of a "national crisis". Similar protests against developments are taking place in other areas, including Denbighshire in north-east Wales, where councillors have approved proposals for thousands of new homes. The fear there is that they will be grabbed by commuters from Cheshire and Liverpool.
The Plaid Cymru MEP Jill Evans has raised her concerns with the European Parliament about the Denbighshire plans and is backing the launch of a national movement – calling itself Waking the Dragon – against such developments.
Over the summer, the Welsh assembly government has consulted over new proposals to specify how the language issue should be factored into the local planning process. It accepts the impact of "demographic change" ought to be taken into account. The results of the consultation are being analysed and assembly members are expected to examine them in the autumn.
In its development plan, Carmarthenshire insists that any proposals will have to take the language issue into account. Developers, for example, will have to submit a "linguistic impact assessment or statement" as part of planning applications.
If problems are anticipated, "mitigation measures" such as making sure a number of homes are affordable to local people or providing "support for the language within the community" should be established.
But the plan says that it has to identify new sites for housing and businesses for the good of Carmarthenshire.
Clive Scourfield, the county council's executive board member for regeneration, accepted it was inevitable that many people arriving in the area would not speak Welsh. He urged communities to encourage newcomers to learn the language.
Back in Carmarthen market, Llio Silyn, who runs the Welsh bookshop, said she was worried that young people would not be able to afford the new homes and be squeezed out. "We haven't got anything against people from Cardiff or from England, but the worry is that they will come and push our young people out."
Silyn, who appeared in Hedd Wyn, an Oscar-nominated Welsh language film, said Welsh was vital to the area and its people: "It's a big part of who we are and our place in the world."
Language that refuses to die
• Welsh is a Celtic language, closely related to Cornish and Breton. The Welsh spoken today is directly descended from the language of the sixth century.
• The passing of the 1536 and 1542 Acts of Union made English the language of law and administration of government. Although the Welsh language was not banned, it lost its status and centuries of steady linguistic decline followed.
• Until the mid-19th century, the majority of the Welsh population could speak Welsh – more than 80%.
• The 2001 census showed that 20.8% of the population was able to speak Welsh (582,400 people), an increase compared to the 1991 census (18.7%).
• Welsh-speaking heartlands include Carmarthenshire in the south-west, Gwynedd, Anglesey, Conwy, Denbighshire in the north and Ceredigion in the west.
• The number of communities where more than 70% of the population was able to speak Welsh dropped to 54 according to the 2001 census, compared with 92 in 1991. It is argued that a high density of speakers is required for Welsh to be an everyday language of the community.
• Migration has had a profound effect on traditional heartlands, with many Welsh-speaking young people moving to urban areas to work, coupled with the arrival of people unable to speak Welsh.
Selling Content - That's a Job
Ten years is, of course, a long time in media. Ten years ago, if you wanted to download some music, your best bet was Napster or one of the filesharing systems such as LimeWire or KaZaA. There were legal services, but they were so dire they would scarcely pass muster today: PressPlay and MusicNet (from rival groups of record companies) required $15-a-month subscriptions for low-quality streaming (when most people had dialup connections, not today's broadband). You couldn't burn to CD. They were stuffed with restrictive software to prevent you sharing the songs.
What happened? Steve Jobs happened, mainly. The hardware and design team at Apple came up with the iPod (initially intended to be a way to sell more Macintosh computers), and then followed the iTunes Music Store – a great way to tie people to Apple by selling music. In 2003 Jobs persuaded the music companies – which wouldn't license their songs to bigger names like Microsoft – to go with him because, he said, Apple was tiny (which it was, at the time). The risk if people did start sharing songs from the store was minimal, he argued. The record labels looked at Apple's tiny market share (a few per cent of the PC market) and reckoned they'd sell about a million songs a year, so they signed up.
Apple sold a million songs in the first week the iTunes Music Store was open – and that was in the US alone. It sold 3m within a month. It's never looked back.
Nowadays Apple sells TV shows, films, books and apps, as well as music. We take the explosion in available content for granted. But without Jobs, it's likely we wouldn't be here at all; his negotiating skill is the thing that Apple, and possibly the media industry, will miss the most, because he got them to open up to new delivery mechanisms.
Content companies have been reluctant to let their products move to new formats if they aren't the inventors, or at least midwives. Witness Blu-ray, a Sony idea which wraps up the content so you can't ever get it off the disc (at least in theory); or 3D films. Yet neither is quite living up to its promise, and part of that comes down to people wanting to be able to move the content around – on an iPod, iPhone, iPad or even a computer – in ways the content doesn't allow. Apps downloaded directly to your mobile? Carriers would never have allowed it five years ago. Flat-rate data plans? Ditto. But all good for content creators.
Jobs pried open many content companies' thinking, because his focus was always on getting something great to the customer with as few obstacles as possible. In that sense, he was like a corporate embodiment of the internet; except he thought people should pay for what they got. He always, always insisted you should pay for value, and that extended to content too. The App Store and Music Store together remain one of the biggest generators of purely digital revenue in the world, and certainly the most diverse; while Google's Android might be the fastest-growing smartphone OS, its Market generates pitiful revenues, and I haven't heard of anyone proclaiming their successes from selling music, films or books through Google's offerings.
Jobs's resignation might look like the end of an era, and for certain parts of the technology industry it is. For the content industries, it's also a loss: Jobs was a champion of getting customers who would pay you for your stuff. The fact that magazine apps like The Daily haven't set the world alight (yet?) isn't a failure of the iPad (which is selling 9m a quarter while still only 15 months old; at the same point in the iPod's life, just 219,000 were sold in the financial quarter, compared with the 22m – 100 times more – of its peak). It's more like a reflection of our times.
So if you're wondering how Jobs's departure affects the media world, consider that it's the loss of one of the biggest boosters of paid-for content the business ever had. Who's going to replace that?
by Charles Arthur
Sunday, August 28, 2011
The Press Pressed
There's an awful doubt beginning to infect the media scene as autumn comes. It takes the most commonplace assumption of newspaper life and hangs a great question mark over it. We're constantly told that newspapers as we know them are in a period of transition, moving to become purely digital papers on the web, on tablets, on mobiles, on gadgets as yet uninvented. There is light at the end of a long tunnel of uncertainty, a vital transition. Yet suppose, just suppose, that there's not.
Readers who read the online runes will recognise some of the doubts involved here: advertisements priced much cheaper than print, because cyberspace is infinite and therefore infinitely available; paywalls that raise useful sums that aren't quite useful enough; tablet efforts such as Rupert Murdoch's Daily, which begin in a blaze of publicity then disappear behind a veil of silence; phone applications that seem hugely promising until you try charging a regular rate for them.
None of this means there isn't good money to be made on the net. Some specialist sheets and smooth operators are doing that already. But your average, all-purpose paper on a standard path to survival? Forget it.
On the surface of things, cause and effect seem inexorably clear. Paid circulation of national papers slides from 13m to 9m over a couple of decades. Internet usage for news balloons from nothing to nearly 4 million uniques a day in the case of the Daily Mail, while the Guardian reaps the fruits of its phone-hacking coverage. One side goes up, one slips back. It's a simple equation, surely? But look closer.
Take free daily papers, the ones that, for instance, have contributed hugely to the great disaster of London circulation results. Take 750,000 Metros each morning, another 100,000 City AMs, plus 750,000 Evening Standards. Look at what's being read on your commuter train or bus. People aren't sitting with an iPad: they're turning pages of reading material that cost them nothing. This isn't the average death-of-newspapers lecture. This is free print taking over from expensive print.
And expense, of course, is another factor. A pound a day for your favourite morning read, a couple of quid on a Sunday, a magazine or two for the family? Say £600 or £700 a year. It's a big item in the midst of a big squeeze, as fewer copies sold – and disasters such as the News of the World – mean fewer and fewer newsagents can make a living. You can't easily buy something you can't easily find.
Figures that lump all newspapers together as though they're the same can be pretty misleading, too. Go back to a world before the internet. In August 1970, the Daily Mirror sold 4,486,693 a day and the Daily Express some 3,605,883. Ten years later that was down to 3,624,575 and 2,224,651 respectively. In August 1990 it was 3,121,050 and 1,608,361. Think 3.5m wiped from the two biggest titles of their day in a brutal 20 years, but don't think of the curse of the net. It didn't exist. And some papers – the Mail, the Mail on Sunday, the FT, the Times – are selling as many or more today as they were in 1990. So movements in "the sector" can be somewhat misleading.
This doesn't mean that the general sales trend isn't down, by 5% or 6% a year. But general is not particular – and neither is there any convincing correlation between individual swings and roundabouts. The Mail, with its potential 80 million unique visitors online, is holding its own in print circulation; the Express, with no web effort worth mentioning, is flaking away.
So any "transition", when and if it comes, looks patchy and unpredictable. Is the Mail in print dying? Not at all. Is the Mail online snapping at its heels? Not when all those unique visitors attract only £18m a year in advertising. Does the Daily Telegraph, for all its huge web investment, see a future without print? Specifically not. Every newspaper has a different take on things to come and a different prescription for survival (while old prescriptions based on political leanings – one left, one right etc – are surely not redundant either).
It's comforting to talk about industry tendencies and share of total audience. Trinity Mirror has been doing it for years, to convince investors that its nationals are doing better than average. But no one engaging their brain needs to buy the whole story. On the contrary, there are other, simpler, consequences to consider here. The Atlanta Journal-Constitution (to nominate one prize example) runs a bit short of resources. It cuts back on deliveries to outlying areas to save money. That means circulation falls faster than ever, which in turn lops away advertising cash. The old Journal-Constitution wasn't doing too badly until it decided to make things worse.
No: the supposed facts around transition can be misty going on mystic. More than that, the arithmetic doesn't always add up. If you're producing the New York Times with, say, 1,100 staff journalists (roughly double the Telegraph, Times or Guardian norm) then those reporters, commentators and correspondents – integrated to serve a print paper and a website interchangeably – are a vital expense. They bring the expertise you must have and can charge for. You need your foreign bureaus and specialists in any medium. You can't take a scythe to such costs.
The nominated nirvana of transition is an eventual digital-only operation with the heavy-industry costs of paper and presses – maybe 60% of the production bill – theoretically removed. But how does that work in an era where print and its price structures are history, and far cheaper ads and subscriptions bring in the money that's needed? Nothing you see around you anywhere in the world currently hints at revenues on the requisite scale. The Huffington Post, now beating the New York Times's numbers of unique visitors, can afford to employ only a tenth as many journalists – and still can't pay its contributors.
Free access helps ad revenue, but not enough; paid subscriptions make a contribution, but nowhere near enough. Even a profit-making news venture such as Politico, the trailblazing specialist site for US political wonks, relies on a print version and print ads to keep it in the black. And that, for all the hype, is also the case for the splendidly successful specialist sites of the Wall Street Journal and the FT. They prosper because they're tied to market-leading print newspapers. The two media can grow simultaneously, as they do at the WSJ.
Perhaps, on the Bloomberg or Thomson Reuters model, there is a life after print. But that's the weight you can place on particular services for particular audiences. A general daily paper – all-singing, all-dancing, all-providing – has a seemingly crippling cost structure to bear if it goes wholly online. One can talk, almost emotionally, about transition. But, coldly, the next generation of the news business may involve sweeping renewal instead.
Yes, you can make money from special services. Politico can do it by targeting powerful politicians. Forbes.com targets international conference organisers. Auto Trader in Britain has a vice-like grip on the used car market online (in straightforward transition from print). There are lots of good ideas; there is loads of opportunity.
But suppose – as many are beginning to do – that you look at the evolving digital landscape and ask the basic question the other way round. Knowing what we know now, would anyone have invented a newspaper in the first place, rather than news services that come free on the net (from the BBC for starters) and a myriad of separate specialist strands so that users can follow their driving interests, from hedge funds to celebrity couplings? And, in most though not all cases, the answer is no: you wouldn't have invented newspapers, with their inevitably wide spread of coverage and equally inevitable burdens of cost. Least of all – alas for Murdoch's doomed, depleted Daily – would you invent a quasi-newspaper for tablet users only.
Transition, in any comforting, life-goes-on sense, is probably an illusion. Think way outside the box instead. Think unpredictable upheaval, utter transformation, to no set timetable; a stuttering, deluding rate of change. Think of a revolution we've barely begun to glimpse as yet.
by Peter Preston
SanFran is not The Big Apple
Steve Jobs's resignation was the most discussed in corporate history. Because his illness has been public knowledge for so long, and because Wall Street and the commentariat viewed his health as synonymous with that of his company, Apple's share price has for years fluctuated with its CEO's temperature. If all the "Whither Apple without Jobs?" articles were laid end to end, they would cover quite a distance – but they never reached a conclusion.
Still, you could understand the hysteria. After all, he's the man who rescued Apple from the near-death experience it underwent in the mid-1990s. When he came back in 1996, the company seemed headed for oblivion. Granted, it was a distinctive, quirky outfit, but one that had been run into the ground by mediocre executives who had no vision, no strategy – and no operating system to power its products into the future.
Jobs came back because Apple bought NeXT, the computer workstation company he had started after being ousted by the Apple board in 1985. By acquiring NeXT, Apple got two things: the operating system that became OS X, the software that underpinned everything Apple has made since; and Jobs as "interim CEO" at a salary of $1 a year. But it was still a corporate minnow: a BMW to Microsoft's Ford. Fifteen years later, Apple had become the most valuable company in the world.
It was the greatest comeback since Lazarus. Because only an obsessive, authoritarian, visionary genius could have achieved such a transformation, it's easy to see why Wall Street has had difficulty imagining Apple without Jobs. He was, after all, the only CEO in the world with rock star status. And Apple is a corporate extension of his remarkable personality, much as Microsoft was of Bill Gates's. But Jobs has something Gates never had – a reputation so powerful as to create a reality distortion field around him.
This field has blinded people to some under-appreciated facts. While it is true, for example, that Apple – under Jobs's influence – is probably the world's best industrial design outfit, it is also a phenomenally well-run company. Proof of that comes from various sources. For example, not only does it regularly dream up beautiful, functional and fantastically complex products, but it gets them to market in working order, on time and to budget; and it has continually done so despite exploding demand. Compare that with slow-motion car crashes such as Hewlett-Packard's TouchPad, RIM's BlackBerry PlayBook or Microsoft's Vista operating system.
Then there's the way that Apple – in the teeth of industry scepticism – made such an astonishing success of bricks-and-mortar retailing with its high-street stores. Or ponder the fact that it became the world's most valuable corporation without incurring a single cent of debt. Instead, it sits atop a $78bn (£48bn) cash mountain: enough to buy Tesco and BT and still have loose change. Compare that with the casino capitalism practised by so many MBA-educated company leaders in the US. And finally there is the stranglehold Apple now has on a number of crucial modern markets – computers, online music, mobile devices and smartphones.
If you ask people what Steve Jobs is best remembered for, most will name a particular product. If they're from my (baby boomer) generation, it will probably be the Apple Macintosh, a computer that changed many of our lives in the 1980s. Younger generations will credit him with the iMac, iTunes and the iPod. Today's teenagers will revere him for the iPhone. But there's a good argument that Jobs's greatest creation is Apple itself in its post-1996 incarnation. If that's true, the great test of his career legacy is whether the organisation he built around his values will endure and remain faithful to them.
What are those values? He usually expressed them as aphorisms and, as news of his resignation spread, people began raking through them for clues. Many focused on what he said to John Sculley, CEO of Pepsi, when he was trying to persuade him to run Apple.
"Do you want to spend the rest of your life selling sugar water," he asked, "or do you want to change the world?" (Sculley accepted the invitation, then presided over Jobs's expulsion.) But for Jobs it was a serious question. What he was asking, as the blogger Umair Haque put it, was: "Do you really want to spend your days slaving over work that fails to inspire, on stuff that fails to count, for reasons that fail to touch the soul of anyone?"
Jobs is famously fanatical about design. In part this is about how things look (though for him it also involves simplicity of use). When the rest of the industry was building computers as grey, rectangular metal boxes, for example, he was prowling department stores and streets looking for design metaphors. For a time he thought the Mac should be like a Porsche. At another stage he wanted it to be like the Cuisinart food-processor. When the machine finally appeared in 1984, Jack Tramiel, the grizzled macho-boss of Commodore, thought it looked like a girly device that would be best sold in boutiques. What Tramiel did not realise – and Jobs did – was that ultimately computers would be consumer products and people would pay a huge premium for classy design.
In that sense he is the polar opposite of the MBA-trained, bean-counting executive. "The cure for Apple is not cost-cutting," he said in 1996, when the company was on the rocks. "The cure for Apple is to innovate its way out of its current predicament." At another point he said: "When you're a carpenter making a beautiful chest of drawers, you're not going to use a piece of plywood on the back, even though it faces the wall and nobody will ever see it."
This delight in elegant work has always been the most striking aspect of Jobs's celebrated speeches introducing new Apple products in San Francisco. As he cooed over the iMac or the iPhone or the iPad, words like "beautiful", "amazing" and "awesome" tumbled out. For once they didn't sound like cynical, manipulative corporation-speak. He spoke from the heart.
It goes without saying that he is impossible to work with; most geniuses are. Yet he has built – and retained the respect of – the most remarkable design team in living memory, a group that has been responsible for more innovation than the rest of the computer industry put together. For that reason, when the time comes to sum up Jobs's achievements, most will portray him as a seminal figure in the computing industry. But Jobs is bigger than that.
To understand why, you have to look at the major communications industries of the 20th century – the telephone, radio and movies. As Tim Wu chronicles it in his remarkable book, The Master Switch, each of these industries started out as an open, irrationally exuberant, chaotic muddle of incompatible standards, crummy technology and chancers. The pivotal moment in the evolution of each industry came when a charismatic entrepreneur arrived to offer consumers better quality, higher production values and greater ease of use.
With the telephone it was Theodore Vail of AT&T, offering a unified nationwide network and a guarantee that when you picked up the phone you always got a dial tone. With radio it was David Sarnoff, who founded RCA. With movies it was Adolph Zukor, who created the Hollywood studio system.
Jobs is from the same mould. He believes that using a computer should be delightful, not painful; that it should be easy to seamlessly transfer music from a CD on to a hard drive and thence to an elegant portable player; that mobile phones should be powerful handheld computers that happen to make voice calls; and that a tablet computer is the device that is ushering us into a post-PC world. He has offered consumers a better proposition than the rest of the industry could – and they jumped at it. That's how he built Apple into the world's most valuable company. And it's why he is really the last of the media moguls.
by John Naughton
Tuesday, August 23, 2011
2001-2011
by Jason Burke
If, just over a decade ago, you had looked north through binoculars from frontline Taliban positions 30 miles north of Kabul, you would have seen an old Soviet-built airbase, little more than a cluster of ruined buildings, rusting metal stakes, a single battered jeep and no serviceable aircraft at all on the scarred strip of concrete shimmering in the Afghan sun. The group of scruffy Taliban fighters in filthy clothes who manned the makeshift trenches on the heights above it would probably have served you grapes and tea, as they did the rare reporters who visited them.
If you had come back just a little later, say in the spring of 2002, you would have seen a startling difference. With the Taliban apparently defeated, the airstrip had become the fulcrum of a build-up of American and other international forces in the country that would continue inexorably over the next years. The feverish activity of the bulldozers, tents, jets and helicopters gave a sense that something extraordinary was happening. But its exact nature was still very unclear. Now, after a decade of conflict, a base the size of a small town has sprung up around the airstrip.
No soldiers at the battle of Castillon in 1453 knew they were fighting in the last major engagement of the hundred years war. No one fighting at Waterloo could have known they were taking part in what turned out to be the ultimate confrontation of the Napoleonic wars. The first world war was the great war until the second world war came along. Perhaps inevitably, then, the ongoing, interlinked and overlapping conflicts that have raged across the globe during the 10 years since 9/11 are currently without a name. In decades or centuries to come historians will no doubt find one – or several, as is usually the case. In the interim, given the one event that, in the western public consciousness at least, saw hostilities commence, "the 9/11 wars" seems an apt working title.
Al-Qaida has failed to achieve most of its key aims: there has been no global uprising of Muslim populations, no establishment of a new caliphate. Nor have changes in America's policy in the Islamic world been those desired by men such as the late Osama bin Laden. Does this mean the west has won the 9/11 wars? It has certainly avoided defeat. The power of terrorism lies in its ability to create a sense of fear far in excess of the actual threat posed to an individual. Here, governments have largely protected their citizens, and few inhabitants of western democracies today pass their lives genuinely concerned about being harmed in a radical militant attack. In July 2010, President Obama even spoke of how the US could "absorb" another 9/11, a statement that would have been inconceivable a few years before.
Despite significant damage to civil liberties in both Europe and America, institutional checks and balances appear to have worked on both sides of the Atlantic. In the face of a worrying militarisation and a commensurate growth in its offshoot, the "security" business, other forces have been strong enough to ensure that liberal democratic societies have kept their values more or less intact. The integration of minorities, always a delicate task, is generating significant tensions but is proceeding, albeit unevenly.
Even though now facing serious problems of debt, America has nonetheless been able to pay for the grotesque strategic error of the war in Iraq, at a total cost of up to a trillion dollars depending on how it is calculated, and a 10-year conflict in Afghanistan, all while financing a huge security industry at home. In 2009, American military expenditure was $661bn (£400bn), considerably more than double the total of 10 years previously, but still not enough, as Bin Laden had hoped, to fundamentally weaken the world's only true superpower. In Europe, supposedly creaking old democracies have reacted with a nimbleness and rapidity that few imagined they still possessed to counter domestic and international threats.
In short, western societies and political systems appear likely to digest this latest wave of radical violence as they have digested its predecessors. In 1911, British police reported that leftist and anarchist groups had "grown in number and size" and were "hardier than ever, now that the terrifying weapons created by modern science are available to them". The world was "threatened by forces which would be able to one day carry out its total destruction," the police warned. In the event, of course, it was gas, machine guns and artillery followed by disease that killed millions, not terrorism.
In the second decade of the 9/11 wars other gathering threats to the global commonwealth, such as climate change, will further oblige Islamic radical militants to cede much of the limelight, at least in the absence of a new, equally spectacular cycle of violence.
But if there has been no defeat for the west then there has been no victory either. Over the past 10 years, the limits of the ability of the US and its western allies to impose their will on parts of the world have been very publicly revealed. Though it is going too far to say that the first decade of the 9/11 wars saw the moment where the long decline of first Europe and then perhaps America became clear, the conflict certainly reinforced the sense that the tectonic plates of geopolitics are shifting. After its military and diplomatic checks in Iraq and Afghanistan, a chastened Britain may well have to finally renounce its inflated self-image as a power that "punches above its weight". The role of Nato in the 21st century is unclear. Above all, though the power, soft and hard, cultural and economic, military and political, of the US and Europe remains immense and often hugely underestimated, it is clear that this will not always be the case.
For many decades, the conventional wisdom has been that economic development around the globe would render liberal democracy and free-market capitalism more popular. One of the lessons of the 9/11 wars is that this optimism was misplaced. A sense of national or religious chauvinism appears often to be a corollary of a society getting richer rather than its opposite, and the search for dignity and authenticity is often defined by opposition to what is seen, rightly or wrongly, as foreign. In some places, the errors of western policy-makers over recent years have provoked a reaction that will last a long time. The socially conservative, moderately Islamist and strongly nationalist narrative that is being consolidated in Muslim countries from Morocco to Malaysia will pose a growing challenge to the ability of the US and European nations to pursue their interests on the global stage for many years to come. This, alongside the increasingly strident voices of China and other emerging nations, means a long period of instability and competition is likely.
American intelligence agencies reported in their four-yearly review in late 2008 that they judged that within a few decades the US would no longer be able to "call the shots". Instead, they predicted, America is likely to face the challenges of a fragmented planet, where conflict over scarce resources is on the rise, poorly contained by "ramshackle" international institutions. The previous review, published in December 2004, when George Bush had just been re-elected and was preparing his triumphal second inauguration, had foreseen "continued dominance" for many years to come. The difference is stark. If the years from 2004 to 2008 brought victory, then America and the west cannot afford many more victories like it.
If clear winners in the 9/11 wars are difficult to find, then the losers are not hard to identify. They are the huge numbers of men, women and children who have found themselves caught in multiple crossfires: the victims of the 9/11 strikes or of the 7/7 and Madrid bombings, of sectarian killings in Baghdad, badly aimed American drone strikes in Pakistan or attacks by teenage suicide bombers on crowds in Afghanistan. They are those executed by Abu Musab al-Zarqawi, the head of al-Qaida in Iraq until his death in 2006; those who died, sprayed with bullets by US Marines, at Haditha; those shot by private contractors careering in overpowered unmarked blacked-out four-wheel-drive vehicles through Baghdad. They are worshippers at Sufi shrines in the Punjab, local reporters trying to record what was happening to their home towns, policemen who happened to be on shift at the wrong time in the wrong place, unsuspecting tourists on summer holidays. They are the refugees who ran out of money and froze to death one by one in an Afghan winter, those many hundreds executed as "spies" by the Taliban, those gunned down as they waited for trains home at Mumbai's main railway station one autumn evening, those who died in cells in Bagram or elsewhere at the hands of their jailers, the provocative film-maker stabbed on an Amsterdam street, all the victims of this chaotic matrix of confused but always lethal wars.
The cumulative total of dead and wounded in this conflict so far is substantial, even if any estimates are necessarily very approximate.
The military dead are the best documented. Though some may have shown genuine enthusiasm for war, or even evidence of sadism, many western soldiers did not enlist with the primary motive of fighting and killing others. A significant number came from poor towns in the midwest of America or council estates in the UK and had joined up for a job, for adventure, to pay their way through college, to learn a craft. By the end of November 2010, the total number of American soldiers who had died in Operation Iraqi Freedom and its successor, Operation New Dawn, was 4,409, with 31,395 wounded. More than 300 servicemen from other nations had been killed too, and many more maimed, disabled or psychologically injured for life. In Afghanistan, well over 2,000 soldiers from 48 different countries had been killed in the first nine years of the conflict. These included 1,300 Americans, 340 Britons, 153 Canadians, 43 Frenchmen and 44 Germans.
Military casualties among western nations – predominantly American – in other theatres of Operation Enduring Freedom, from the Sudan to the Seychelles and from Tajikistan to Turkey, added another 100 or so. At least 1,500 private contractors died in Iraq alone.
Then there were the casualties sustained by local security forces. Around 12,000 police were killed in Iraq between 2003 and 2010. In Afghanistan, the number of dead policemen since 2002 had exceeded 3,000 by the middle of 2010. Many might have been venal, brutal and corrupt, but almost every dead Afghan policeman left a widow and children in a land where bereavement often leads to destitution. In Pakistan, somewhere between 2,000 and 4,000 policemen have died in bombing or shooting attacks. As for local military personnel in the various theatres of conflict, there were up to 8,000 Iraqi combat deaths in the 2003 war, and another 3,000 Iraqi soldiers are thought to have died over the subsequent years. In Afghanistan, Afghan National Army casualties were running at 2,820 in August 2010, while in Pakistan around 3,000 soldiers have been killed and at least twice as many wounded in the various internal campaigns since 2001. Across the Middle East and further afield in the other theatres that had become part of the 9/11 wars, local security forces paid a heavy price too. More than 150 Lebanese soldiers were killed fighting radical "al-Qaida-ist" militants in the Nahr al-Bared refugee camp in Lebanon in 2007, for example. There were many others, in Saudi Arabia, in Algeria, in Indonesia. In all, adding these totals together, at least 40,000 and perhaps 50,000 soldiers and policemen have so far died.
Casualties among their enemies – the insurgents or the extremists – are clearly harder to establish. Successive western commanders said that they did not "do body counts", but most units kept track of how many casualties they believed they had inflicted, and these totals were often high. At least 20,000 insurgents were probably killed in Iraq, roughly the same number in Pakistan, possibly more in Afghanistan. In all that makes at least 60,000, again many with wives and children.
Then, of course, there are those, neither insurgent nor soldier, neither terrorist nor policeman, who were caught in a war in which civilians were not just features of the "battle space" but very often targets. In 2001, there were the 9/11 attacks themselves, of course, with their near 3,000 dead. In 2002 alone, at least 1,000 people died in attacks organised or inspired by al-Qaida in Tunisia, Indonesia, Turkey and elsewhere.
The casualties from such strikes continued to mount through the middle years of the decade. One study estimates 3,013 dead in around 330 attacks between 2004 and 2008. By the end of the first 10 years of the 9/11 wars, the total of civilians killed in terrorist actions directly linked to the group, or to al-Qaida-affiliated or inspired Islamic militants, was almost certainly in excess of 10,000, probably nearer 15,000, possibly up to 20,000. To this total must be added the cost to civilians of the central battles of the 9/11 wars. In Iraq generally, estimates vary, but a very conservative count puts violent civilian deaths (excluding police) from the eve of the invasion of 2003 to the end of 2010 at between 65,000 and 125,000. They included more than 400 assassinated Iraqi academics and almost 150 journalists killed on assignment. The true number may be many, many times greater. In Afghanistan, from 7 October 2001, the day the bombing started, to mid-October 2003, between 3,000 and 3,600 civilians were killed just by coalition air strikes. Many more have died in other "collateral damage" incidents or through the actions of insurgents. The toll has steadily risen. There were probably around 450 civilian casualties in 2005. From 2006 to 2010 between 7,000 and 9,000 civilian deaths were documented, depending on the source. In 2010 alone, more than 2,000 died. In all, between 11,000 and 14,000 civilians have been killed in Afghanistan, and at least three or four times that number wounded or permanently disabled. In Pakistan, which saw the first deaths outside America of these multiple conflicts when police shot into demonstrations in September 2001, the number of casualties is estimated at around 9,000 dead and between 10,000 and 15,000 injured.
Add these admittedly rough figures together and you reach a total of well over 150,000 civilians killed. The approximate overall figure for civilian and military dead is probably near 250,000. If the injured are included – even at a conservative ratio of one to three – the total number of casualties reaches 750,000. This may be fewer than the losses inflicted on combatants and non-combatants during the murderous major conflicts of the 20th century but still constitutes a very large number of people. Add the bereaved and the displaced, not to mention those who have been harmed through the indirect effects of the conflict, the infant mortality or malnutrition rates due to the breakdown of basic services, and the scale of the violence that we have witnessed over the past 10 years is clear.
Some day the 9/11 wars will be remembered by another name. Most of the dead will not be remembered at all.
Monday, August 22, 2011
A Black And White Answer
YOU might expect that science, particularly American science, would be colour-blind. Though fewer people from some of the country’s ethnic minorities are scientists than the proportions of those minorities in the population suggest should be the case, once someone has got bench space in a laboratory, he might reasonably expect to be treated on merit and nothing else.
Unfortunately, a study just published in Science by Donna Ginther of the University of Kansas suggests that is not true. Dr Ginther, who was working on behalf of America’s National Institutes of Health (NIH), looked at the pattern of research grants awarded by the NIH and found that race matters a lot. Moreover, it is not just a question of white supremacy. Asian and Hispanic scientists do just as well as white ones. Black scientists, however, do badly.
Dr Ginther and her colleagues analysed grants awarded by the NIH between 2000 and 2006, and correlated this information with the self-reported race of more than 40,000 applicants. Their results show that the chance of a black scientist receiving a grant was 17%. For Asians, Hispanics and whites the number was between 26% and 29%. Even when these figures were adjusted to take into account applicants’ prior education, awards, employment history and publications, a gap of ten percentage points remained.
This bias appears to arise in the NIH’s peer-review mechanism. Each application is reviewed by a panel of experts. These panels assign scores to about half the applications they receive (the others are rejected outright). Scored applications are then considered for grants by the various institutes that make up the NIH. The race of the applicant is not divulged to the panel. However, Dr Ginther found that applications from black scientists were less likely to be awarded a score than those from similarly qualified scientists of other races, and when they were awarded a score, that score was lower than the scores given to applicants of other races.
One possible explanation is that review panels are inferring applicants’ ethnic origins from their names, or from the institutions they attended as students. Consciously or not, the reviewers may then be awarding less merit to applications from people with “black-sounding” names, or from those who were educated at universities whose students are predominantly black. Indeed, a similar bias has been found in recruitment for jobs in the commercial world. One well-known study, published in 2003 by researchers at the Massachusetts Institute of Technology and the University of Chicago, found that fictitious CVs with stereotypically white names elicited 50% more offers of interviews than did CVs with black names, even when the applicants’ stated qualifications were identical.
Another possible explanation is social networking. It is in the nature of groups of experts (which is precisely what peer-review panels are) to know both each other and each other’s most promising acolytes. Applicants outside this charmed circle might have less chance of favourable consideration. If the charmed circle itself were racially unrepresentative (if professors unconsciously preferred graduate students of their own race, for example), those excluded from the network because their racial group was under-represented in the first place would find it harder to break in.
Though Dr Ginther’s results are troubling, it is to the NIH’s credit that it has published her findings. The agency is also starting a programme intended to alter the composition of the review panels, and—appropriately for a scientific body—will conduct experiments to see whether excising potential racial cues from applications changes outcomes. Other agencies, and not just in America, should pay strict attention to all this, and ask themselves if they, too, are failing people of particular races. Such discrimination is not only disgraceful, but also a stupid waste of talent.
Blair On The Riots
Both David Cameron and Ed Miliband made excellent speeches last week and there was much to agree with in what they said. None the less, in the overall commentary on the riots, I think we are in danger of the wrong analysis leading to the wrong diagnosis, leading to the wrong prescription.
There were some proximate causes of what happened that are relatively easily dealt with. The police are under huge pressure. If they go in hard, they fear inquiry, disciplinary action and abuse. It's all very well to say that they should just follow the rules. The police need to know they have strong support from politicians and public. When the riots first occurred, they would have been naturally anxious as to how heavy to be. Once they saw the country behind them, they rallied.
But my experience with the police is they need 100% backing. Otherwise, you're asking a lot of the officer on the ground in a tough situation.
Then, some of the disorder was caused by rioters and looters who were otherwise ordinary young people who got caught in a life-changing mistake from which they will have to rebuild.
However, the big cause is the group of young, alienated and disaffected people who are outside the social mainstream and who live in a culture at odds with any canons of proper behaviour. And here's where I don't agree with much of the commentary. In my experience, they are an absolutely specific problem that requires deeply specific solutions.
The left says they're victims of social deprivation, the right says they need to take personal responsibility for their actions; both just miss the point. A conventional social programme won't help them; neither – on their own – will tougher penalties.
The key is to understand that they aren't symptomatic of society at large. Failure to get this leads to a completely muddle-headed analysis.
Britain, as a whole, is not in the grip of some general "moral decline". I see young graduates struggling to find work today and persevering against all the odds. I see young people engaged as volunteers in the work I do in Africa, and in inter-faith projects. I meet youngsters who are from highly disadvantaged backgrounds where my Sports Foundation works in the north-east and I would say that today's generation is a) more respectable b) more responsible and c) more hard-working than mine was. The true face of Britain is not the tiny minority that looted, but the large majority that came out afterwards to help clean up.
I do think there are major issues underlying the anxieties reflected in disturbances and protests in many nations. One is the growing disparity of incomes not only between poor and rich but between those at the top and the aspiring middle class. Another is the paradigm shift in economic and political influence away from the west.
Each requires substantial change in the way we think and function.
However, I would be careful about drawing together the MPs' expenses row, bankers and phone-hackers in all this. We in politics love the grand philosophical common thread and I agree with Ed Miliband on the theme of responsibility.
I became an MP in 1983. Then, MPs were rarely full time, many didn't hold constituency surgeries and there were no rules of any bite governing expenses or political funding. So the idea that MPs today are a work-shy bunch of fraudsters, while back then they were high-minded public servants, is just rubbish: unfair, untrue and unhelpful.
Likewise with the boardroom. I agree totally with the criticisms of excess in pay and bonuses. But is this really the first time we have had people engaged in dubious financial practices or embracing greed, not good conduct? If anything, today's corporations are far more attuned to corporate social responsibility, far better in areas like the environment, far more aware of the need to be gender- and race-balanced in recruiting.
Britain gives generously to those in need abroad: faster and more than many other nations. At a time of cuts, our aid budget – which saves countless thousands of lives – is being protected. There is criticism but the remarkable thing is not how much but how little. The spirit that won the Olympic bid in 2005 – open, tolerant and optimistic – is far more representative of modern London than the criminality displayed by the people smashing shop windows.
And here is what I learned in 10 years of trying to deal with this issue. When I visited the so-called "bad areas", whether in Liverpool, Bristol, Birmingham, London or elsewhere, what I found was not a community out of control. What I found were individuals out of control in a community where the majority, even in the poorest of poor parts, was decent, law-abiding and actually desperate for action to correct the situation.
In witnessing the lifestyles these individuals have, I found two things came together. First, there was a legal system overwhelmed by the nature of the crime committed by these young people, buttressed as it is by gangs and organised crime.
Second, these individuals did not simply have an individual problem. They had a family problem. This is a hard thing to say and I am of course aware that this, too, is a generalisation. But many of these people are from families that are profoundly dysfunctional, operating on completely different terms from the rest of society, middle class or poor.
Most of them are shaping up that way by the time they are in primary school or even in nursery. They then grow up in circumstances where their role models are drug dealers, pimps, people with knives and guns, people who will exploit them and abuse them but with whom they feel a belonging. Hence the gang culture that is so destructive.
This is a phenomenon of the late 20th century. You find it in virtually every developed nation. Breaking it down isn't about general policy or traditional programmes of investment or treatment. The last government should take real pride in the reductions in inequality, the improvement in many inner-city schools and the big fall in overall crime. But none of these reaches this special group.
By the end of my time as prime minister, I concluded that the solution was specific and quite different from conventional policy. We had to be prepared to intervene literally family by family and at an early stage, even before any criminality had occurred. And we had to reform the laws around criminal justice, including on antisocial behaviour, organised crime and the treatment of persistent offenders. We had to treat the gangs in a completely different way to have any hope of success. The agenda that came out of this was conceived in my last years of office, but it had to be attempted against a constant backdrop of opposition, left and right, on civil liberty grounds and on the basis we were "stigmatising" young people. After I'd left, the agenda lost momentum. But the papers and the work are all there.
In 1993, following James Bulger's murder, I made a case in very similar terms to the one being heard today about moral breakdown in Britain. I now believe that speech was good politics but bad policy. Focus on the specific problem and we can begin on a proper solution. Elevate this into a high-faluting wail about a Britain that has lost its way morally and we will depress ourselves unnecessarily, trash our own reputation abroad and, worst of all, miss the chance to deal with the problem in the only way that will work.
Friday, August 19, 2011
The Berbers
IN MOROCCO their language has been made official. In Algeria they lead protests against President Abdelaziz Bouteflika’s regime. In Tunisia they are rediscovering a long-suppressed identity. In Libya they man the rebels’ western front in the mountains south of the capital, which is still held by Muammar Qaddafi. Even in Egypt’s oasis of Siwa, near Libya’s border, Berbers are finding that the revolution has given them a chance to revive their cultural rights.
“There is a Berber renaissance taking place across north Africa,” enthuses Mounir Kejji, a Moroccan Berber campaigner. In his country a new constitution, endorsed in a referendum on July 1st, officially recognises the Berber language for the first time, though parliament will decide what this means in practice; Arab nationalists and many Islamists have long demanded that Arabic be the sole language of administration and state education.
The authoritarian Arab nationalist regimes that dominated the region used to accuse the Berbers of threatening national cohesion. Now, shaken and in some cases overthrown, they have seen Berber activism take on a new lease of life. Even where they are a minority of only a few thousand, as in Egypt and Tunisia, Berbers have been able for the first time to form community associations.
Libya’s rebellion is fiercest in the Nafusa Mountains, a Berber heartland long neglected by the government. Colonel Qaddafi has refused to acknowledge Berber culture for most of his reign, describing it as “colonialism’s poison” intended to divide the country. Only in 2006, apparently after his son Seif al-Islam intervened, did he lift a ban on the use of Berber names.
Berbers make up about 5% of Libya’s 6m-7m people, though some activists put the figure higher. In recent weeks they have set up a radio station. The rebel-controlled Libya TV, based in Qatar, now broadcasts in Tamazight, the Berber tongue, for two hours a day. In June, says Mr Kejji, a delegation of Libyan Berbers affiliated to the rebels’ Transitional National Council put a linguistic query to their Moroccan counterparts: how should they write “army” and “national security” in Tamazight, so that Libyan uniforms could have a badge in their own language alongside Arabic?
A written script for the various Berber dialects was created only in the 20th century. Algeria’s Kabyles, a Berber people said to number 4m, have usually preferred the Latin alphabet, whereas a Tuareg alphabet, called Tifinagh, is now officially used in Morocco and has been adopted by Libyan Berbers who were banned from using it under the colonel. (The Tuareg are nomadic Berber pastoralists living mainly in southern Algeria, eastern Mali and western Niger.)
The Berber revival has rekindled enthusiasm for pan-Berber solidarity. “There’s an awareness among Berbers across north Africa of that element of their identity which they share,” says Hugh Roberts, an expert on the Maghreb. But each country in the region, he says, has its own particularities. The dream of creating a community of 20m-plus people (estimates of the total vary widely), stretching from Egypt’s western desert to the Atlantic, would be stymied by the multiplicity of Berber dialects and by the variety of political circumstances. “A single Berber identity exists only virtually—on the internet and among diaspora intellectuals,” says Mr Roberts.
Tuesday, August 16, 2011
Billionaire... And Progressive
In the process of accumulating one of the greatest fortunes the world has ever seen, Warren Buffett has come to stand apart from the average squillionaire. Not for him the clichés of lavish mansions and superyachts; he prefers instead his modest home in Omaha, Nebraska, and nights in with a burger and cherry cola.
Now Buffett has added to his list of atypical pronouncements by saying that America's super-rich should pay more tax if the country's debt problems are ever to be solved.
Writing in the New York Times on Monday, Buffett argued that the richest members of US society are indulged with an unfairly generous tax regime and are not making a fair contribution to repairing the country's finances.
"While the poor and middle class fight for us in Afghanistan, and while most Americans struggle to make ends meet, we mega-rich continue to get our extraordinary tax breaks," wrote Buffett, whose personal fortune was estimated at $50bn (£30bn) by Forbes this year, making him the third richest person in the world behind Carlos Slim and Bill Gates.
"These and other blessings are showered upon us by legislators in Washington who feel compelled to protect us, much as if we were spotted owls or some other endangered species. It's nice to have friends in high places," the 80-year old investor added.
Buffett, known as the Sage of Omaha, built his fortune on a no-frills investment strategy and was a fierce critic of the exotic financial instruments that brought the banking system to its knees in 2008, dubbing them financial weapons of mass destruction.
A long-time critic of the US tax system, he has calculated that he handed over 17.4% of his income as tax last year – a lower proportion than any of the 20 other people who work in his office.
Under the debt ceiling deal agreed in Washington, a "super committee" of 12 congressmen and senators must find $1.5tn of savings to help reduce America's national debt. Tax rises are hugely unpopular with elements within the Republican party, with the Tea Party movement adamant that America should balance its books by cutting public spending.
Buffett argues that this super-committee should raise the tax rate paid by those earning more than $1m a year, including earnings from capital gains which are currently taxed at a lower rate than ordinary income. Those raking in upward of $10m a year could then pay even more.
The package of tax cuts brought in by President George W Bush is set to expire at the end of 2012, although it could be extended. Many of the leading Republicans who hope to challenge Barack Obama at the next presidential election have argued for lower taxation to stimulate the US economy.
On Saturday Rick Perry, the governor of Texas, argued that it was an "injustice" that almost half of all Americans currently pay no federal income tax.
"Spreading the wealth punishes success while setting America on a course for greater dependency on government," Perry argued as he announced his bid for the 2012 Republican nomination.
Buffett argues that the US policymakers should be looking at the other end of the spectrum. As he put it: "My friends and I have been coddled long enough by a billionaire-friendly Congress. It's time for our government to get serious about shared sacrifice."