Tuesday, September 30, 2008
The Slow Food international movement officially began when delegates from 15 countries endorsed this manifesto, written by founding member Folco Portinari, on November 9, 1989.
Our century, which began and has developed under the insignia of industrial civilization, first invented the machine and then took it as its life model.
We are enslaved by speed and have all succumbed to the same insidious virus: Fast Life, which disrupts our habits, pervades the privacy of our homes and forces us to eat Fast Foods.
To be worthy of the name, Homo Sapiens should rid himself of speed before it reduces him to a species in danger of extinction.
A firm defense of quiet material pleasure is the only way to oppose the universal folly of Fast Life.
May suitable doses of guaranteed sensual pleasure and slow, long-lasting enjoyment preserve us from the contagion of the multitude who mistake frenzy for efficiency.
Our defense should begin at the table with Slow Food.
Let us rediscover the flavors and savors of regional cooking and banish the degrading effects of Fast Food.
In the name of productivity, Fast Life has changed our way of being and threatens our environment and our landscapes. So Slow Food is now the only truly progressive answer.
That is what real culture is all about: developing taste rather than demeaning it. And what better way to set about this than an international exchange of experiences, knowledge, projects?
Slow Food guarantees a better future.
Slow Food is an idea that needs plenty of qualified supporters who can help turn this (slow) motion into an international movement, with the little snail as its symbol.
Monday, September 29, 2008
Saturday, September 27, 2008
(from Simon Hoggart's weekly column)
Last week I mentioned how, if you offer money to St Anthony to help you find something, and he does, you must pay your debt. Some readers have reminded me of a favourite old joke along those lines: a chap is driving to a crucial appointment, but he can't find anywhere to park. He prays to God: "If you find me a space I promise I will go to church every Sunday and make a massive contribution to the steeple fund." At that very second a car pulls out and leaves a spot. The chap says, "Oh, forget it, God, I've found one ..."
The time has come for a final report on the 43rd president of the US
The man who set out to reinforce unbridled American power has weakened it in all three essential dimensions
As the two men who would succeed him train like Olympic athletes for tomorrow's foreign policy debate, pause for a moment to complete your final report on the 43rd president of the United States. What would you say?
I would sum up his two terms in four words: hubris followed by nemesis.
Remember the mood music of eight years ago. The greatest power the world has ever seen. Rome on steroids. An international system said to be unipolar, and Washington's unabashed embrace of unilateralism. The US as "Prometheus unbound", according to the neoconservative commentator Charles Krauthammer. Wall Street investment bankers bestriding the financial globe as Pentagon generals did the military globe and Harvard professors the soft power one. Masters of the universe. Personifying that hubristic moment: George Walker Bush.
And now: nemesis. The irony of the Bush years is that a man who came into office committed to both celebrating and reinforcing sovereign, unbridled national power has presided over the weakening of that power in all three dimensions: military, economic and soft. "I am not convinced we are winning it in Afghanistan," Admiral Mike Mullen, chairman of the Joint Chiefs of Staff, told a congressional committee earlier this month. Many on the ground say that's an understatement. The massive, culpable distraction of Iraq, Bush's war of choice, leaves the US - and with it the rest of the west - on the verge of losing the war of necessity. Here, resurgent in Afghanistan and Pakistan, are the jihadist enemies who attacked the US on September 11 2001. By misusing military power, Bush has weakened it.
Economically, the Bush presidency ends with a financial meltdown on a scale not seen for 70 years. The proud conservative deregulators (John McCain long among them) now oversee a partial nationalisation of the American economy that would make even a French socialist blush. A government bailout that will total close to a trillion dollars, plus the cumulative cost of the Iraq war, will push the national debt to more than $11 trillion. The flagships of Wall Street either go bust or have to be salvaged, with the help of government or foreign money. Most ordinary Americans feel poorer and less secure.
The decline in soft power - the power to attract - is also dramatic. The Pew Global Attitudes Survey has recorded a precipitous worldwide fall in favourable views of the US since 2001. The map is chequered, of course, but the distaste extends beyond policies of the Bush administration to things such as "American ways of doing business", and "American ideas about democracy". Iraq has been central to this collapse of credibility and attractiveness. When Bush denounces Russia for invading a sovereign country (Georgia), as he did again at the UN on Tuesday, a cry of "humbug" goes up around the world. Now American-style free market capitalism is taking a further hit, while some of the alternative models are looking better.
Last weekend, five former US secretaries of state - two Democrat, three Republican - gathered for a panel discussion on the future of foreign policy, televised by CNN. Asked by Christiane Amanpour what should be the biggest concern for the new president, Colin Powell replied: "To restore a sense of confidence in the United States of America." Madeleine Albright added that the world of 2009 would be full of issues "that can only be solved in cooperation with other countries". And, Republican and Democrat, they chorused "close Guantánamo".
Even George Bush now seems to concur with this criticism of George Bush - and I don't just mean speculation that the father is privately critical of the son. Eight years ago, president Bush the younger hardly seemed to know what the word "multilateral" meant. In the course of his farewell address to the UN general assembly this week, he used the word "multilateral" 10 times.
Obviously not all this mess can be blamed on Bush: he's not responsible for the epochal rise of China, nor for jihadist terrorists' long-term hatred of the west.
But a great deal of it can. At the Truman Library in Independence, Missouri, you can still see the painted glass sign that president Harry Truman placed on his desk in the oval office: The Buck Stops Here. (On the back it says: I'm From Missouri.) The buck stops there. The contrast between the president from Missouri and the president from Texas is painful. Judgment, prudence, vision, patience, honesty - every quality that the 33rd president so signally possessed, as the US remade the world after 1945, has been signally lacking in the 43rd.
Iraq, the US's greatest strategic blunder in at least 30 years, is Bush's fault. The buck stops there. And the more we learn about it, the clearer it becomes that it was pursued with a mixture of self-deception and lies. The reporter Ron Suskind has a new book out in which he recounts how, in the run-up to war, British intelligence secured unique access to Saddam Hussein's head of intelligence, Tahir Jalil Habbush.
Habbush told them what turned out subsequently to be the truth: that Saddam had ceased his programme of weapons of mass destruction, but would not admit it, because he was obsessed with keeping regional enemies such as Iran in a state of fear and uncertainty. That version was corroborated by Saddam's foreign minister, to whom French intelligence had originally secured access.
The Bush-Cheney White House ignored both reports, preferring what turned out to be the fabrications of a German intelligence source codenamed Curveball. Curveball indeed. Some of Suskind's reporting has been questioned, but the basic story is not in doubt. The Bush-Cheney White House pressed ahead to war on a fraudulent prospectus, suppressing and distorting very important contrary evidence. As a senior member of the administration told Suskind: "We're an empire now, and when we act we create our own reality." Hubris has rarely been better expressed.
Something similar happened with the vertiginous unreality of hyper-leveraged Wall Street investment banking over the past decade. The financiers' motto, too, could have been "We create our own reality". Again, nemesis follows hubris as the night the day. The White House was not directly responsible for what looks like wild financial irresponsibility, but it was responsible for not supervising and regulating it - something even John McCain is now at least implicitly admitting. The buck stopped there.
As for the decline in American soft power, that is something for which George Bush was directly to blame. His arrogance, his unilateralism, his insensitivity, his long-time denial of the need for urgent action on climate change: all fed directly into the plummeting credit of the US around the world. It would have been a different story with a different president.
For years now, we have seen those who hate the US abusing and burning effigies of Bush. The truth is, the anti-Americans should be building gilded monuments to him. For no one has done more to serve the cause of anti-Americanism than GW Bush. It is we who like and admire the US who should, by rights, be burning effigies. But now, at last, we live in hope of a better America.
- Who's got your old phone's data?
Millions of mobiles are lost and discarded every year, yet their owners give little thought to the sensitive data they contain
- by Pete Warren
Three years ago, Graham Clements – the European managing director of the UK subsidiary of the Japanese packaging multinational Ishida – decided to get rid of his BlackBerry and passed it on to his IT department for recycling. At the start of this month, that BlackBerry was one of the top items on the agenda at the first board meeting that Clements had called since his return from holiday - because it, and the data on it, had come back to haunt him.
Instead of being recycled, the BlackBerry, like millions of other mobile devices every year, had been passed on to a company to be sold. On Clements's device were business plans, details of customer relationships, information on the structure of the company, details of his bank accounts and details about his children.
And Clements isn't alone. It's almost impossible for the average person to wipe a mobile phone clean: unlike a PC, which has an open architecture, mobile phones are closed books in terms of where data resides. "It has taken us over a year to get talks going with Nokia that now allow us to wipe their phones," says Jon Godfrey, director of Sims Lifecycle Services, which recycles mobiles. "We have to go through a different process with each manufacturer. To wipe it, you have to be able to access all the memory – and manufacturers don't want you to do that for all sorts of commercial reasons."
Yet every six months 63,000 phones and around 6,000 PDAs are left in cabs in London alone. At Heathrow airport, 10 phones are handed in every day; one in four has no security and can be turned on by staff. Furthermore, the security of the data on those devices is the responsibility of the person who put it on the phone. It is not illegal to read it; it is up to you to protect it.
The case of Clements is not unique. That BlackBerry was among several that were recovered from mobile phone recycling companies as part of a study into data loss on mobile devices by BT, Glamorgan University, Australia's Edith Cowan University and Sims Lifecycle Services. It was intended to demonstrate just how much data a mobile device can collect about you. For as Clements discovered, we very quickly create intensely personal relationships with these devices.
Just how personal those relationships can be was shown by one BlackBerry recovered in Australia. It revealed that its owner, a businessman, lived in an upmarket part of Sydney. It also contained the details of his various businesses, including bids and contracts under negotiations, uncomplimentary comments about employees, an extensive list of contacts and a complete log of phone calls and diary commitments. It even held extensive and lurid exchanges between the man and a woman he was conducting a clandestine affair with.
With government departments losing laptops and discs teeming with information seemingly every week, it is easy to forget how much data is held on our PDAs and phones. The problem is that very few of us take any care to secure them against loss or theft.
Over the next few years, the phone industry hopes to tempt us with new devices that will be able to hold huge amounts of information, while the financial services industry aims to turn mobiles into payment devices that incorporate credit cards. Nearly all of them are designed so they can be linked to a computer to exchange and back up data or music. When they do, virtually by default, they will exchange information from your address book and your diary.
Is that safe? No. Two years ago CESG, the technical wing of the UK government's eavesdropping organisation GCHQ, which is responsible for advising the government on technology vulnerabilities, was privately briefing that mobile phones cannot be wiped. Now, according to CESG, some measures can be taken, though its spokesman was not prepared to share precisely what those measures are. CESG says: "The government needs assurance that information has been properly erased in all forms of electronic device. Guidance is provided to departments on the most appropriate ways of achieving this. The advice provided to government departments is classified and we are not able, or prepared, to provide detail."
However, as Clements points out, this is exactly the sort of information that is needed. He says: "So what are people meant to do with things when they have finished with them?"
Storing up trouble
According to Godfrey at Sims Lifecycle Services, a discarded, unwiped phone or PDA is "a perfect tool for social engineering, and it's only going to get worse" as the storage capacity of mobile devices increases.
He says: "The point of this work is really to bring across to people the risks that mobile phones present to their personal data." Of the devices in the survey, 7% had enough personal data on them for the individual concerned to have their identity stolen, and 7% would have allowed a corporate fraud to have taken place. Another 2% still had Sim cards in them, while 27% of the BlackBerrys in the survey had company data and 16% carried personal information.
Of the 161 devices in the survey, many were first-generation GSM phones, and only 82 could be made to work. But as Dr Andy Jones, head of information security research at BT's research centre, points out, that alone is significant. "The life expectancy of a mobile device is only slightly longer than that of a butterfly," he says. "People only hold on to their own phones for around 12 months; corporate devices go for 24 months.
"But when they are finished with, the devices are not generally considered to have any intrinsic value to the organisation. When they reach the end of their effective life, they do not appear to be given any consideration with regard to the data that they may still contain."
Says Professor Andrew Blyth of Glamorgan University's computer forensics department: "There are no tools out there at the moment that let you destroy the data on mobile phones, so I think that people need to take the appropriate measures to protect their personal data."
Dr Craig Valli, of Edith Cowan's school of computer and information science, says that many of the BlackBerry devices he analysed represented a significant risk. "Loss of these devices could have resulted in a number of secrets and sensitive information being revealed, the end result of which could have been significant criminal activity." In fact, BlackBerrys do have a remote erase routine, but it is not standard across the industry.
Most of us leave old phones in drawers or cupboards at home until they are given to charities, which pass them on to recycling companies that pay them for the devices.
And then we forget about them, until - as in Clements's case - they turn up three years later. But he was lucky. Most of the phones from recycling companies are destined for Africa and Asia - areas that are rapidly gaining a reputation for ID theft. Do you know where your last mobile phone is now? And whether it was wiped clean before you got rid of it?
Tuesday, September 23, 2008
Both the left and the right say they stand for economic growth. So should voters trying to decide between the two simply look at it as a matter of choosing alternative management teams?
If only matters were so easy! Part of the problem concerns the role of luck. America’s economy was blessed in the 1990s with low energy prices, a high pace of innovation, and a China increasingly offering high-quality goods at decreasing prices, all of which combined to produce low inflation and rapid growth.
President Clinton and then-Chairman of the US Federal Reserve Alan Greenspan deserve little credit for this – though, to be sure, bad policies could have messed things up. By contrast, the problems faced today – high energy and food prices and a crumbling financial system – have, to a large extent, been brought about by bad policies.
There are, indeed, big differences in growth strategies, which make different outcomes highly likely. The first difference concerns how growth itself is conceived. Growth is not just a matter of increasing GDP. It must be sustainable: growth based on environmental degradation, a debt-financed consumption binge, or the exploitation of scarce natural resources, without reinvesting the proceeds, is not sustainable.
Growth also must be inclusive; at least a majority of citizens must benefit. Trickle-down economics does not work: an increase in GDP can actually leave most citizens worse off. America’s recent growth was neither economically sustainable nor inclusive. Most Americans are worse off today than they were seven years ago.
But there need not be a trade-off between inequality and growth. Governments can enhance growth by increasing inclusiveness. A country’s most valuable resource is its people. So it is essential to ensure that everyone can live up to their potential, which requires educational opportunities for all.
A modern economy also requires risk-taking. Individuals are more willing to take risks if there is a good safety net. If not, citizens may demand protection from foreign competition. Social protection is more efficient than protectionism.
Failures to promote social solidarity can have other costs, not the least of which are the social and private expenditures required to protect property and incarcerate criminals. It is estimated that within a few years, America will have more people working in the security business than in education. A year in prison can cost more than a year at Harvard. The cost of incarcerating two million Americans – one of the highest per capita rates in the world – should be viewed as a subtraction from GDP, yet it is added on.
A second major difference between left and right concerns the role of the state in promoting development. The left understands that the government’s role in providing infrastructure and education, developing technology, and even acting as an entrepreneur is vital. Government laid the foundations of the Internet and the modern biotechnology revolutions. In the nineteenth century, research at America’s government-supported universities provided the basis for the agricultural revolution. Government then brought these advances to millions of American farmers. Small business loans have been pivotal in creating not only new businesses, but whole new industries.
The final difference may seem odd: the left now understands markets, and the role that they can and should play in the economy. The right, especially in America, does not. The New Right, typified by the Bush-Cheney administration, is really old corporatism in a new guise.
These are not libertarians. They believe in a strong state with robust executive powers, but one used in defense of established interests, with little attention to market principles. The list of examples is long, but it includes subsidies to large corporate farms, tariffs to protect the steel industry, and, most recently, the mega-bail-outs of Bear Stearns, Fannie Mae, and Freddie Mac. But the inconsistency between rhetoric and reality is long-standing: protectionism expanded under Reagan, including through the imposition of so-called voluntary export restraints on Japanese cars.
By contrast, the new left is trying to make markets work. Unfettered markets do not operate well on their own – a conclusion reinforced by the current financial debacle. Defenders of markets sometimes admit that they do fail, even disastrously, but they claim that markets are “self-correcting.” During the Great Depression, similar arguments were heard: government need not do anything, because markets would restore the economy to full employment in the long run. But, as John Maynard Keynes famously put it, in the long run we are all dead.
Markets are not self-correcting in the relevant time frame. No government can sit idly by as a country goes into recession or depression, even when caused by the excessive greed of bankers or misjudgment of risks by security markets and rating agencies. But if governments are going to pay the economy’s hospital bills, they must act to make it less likely that hospitalization will be needed. The right’s deregulation mantra was simply wrong, and we are now paying the price. And the price tag – in terms of lost output – will be high, perhaps more than $1.5 trillion in the United States alone.
The right often traces its intellectual parentage to Adam Smith, but while Smith recognized the power of markets, he also recognized their limits. Even in his era, businesses found that they could increase profits more easily by conspiring to raise prices than by producing innovative products more efficiently. There is a need for strong anti-trust laws.
It is easy to host a party. For the moment, everyone can feel good. Promoting sustainable growth is much harder. Today, in contrast to the right, the left has a coherent agenda, one that offers not only higher growth, but also social justice. For voters, the choice should be easy.
Joseph E. Stiglitz, Professor at Columbia University, received the 2001 Nobel Prize in economics. He is the co-author, with Linda Bilmes, of The Three Trillion Dollar War: The True Costs of the Iraq Conflict. www.project-syndicate.org
Saturday, September 20, 2008
It is beyond the scope of Micheál Ó Siochrú's excellent book to explore in detail the impact of Cromwell's legacy on modern Irish politics. However, he does provide a telling anecdote: in 1997 Robin Cook, the newly appointed foreign secretary, received a courtesy visit from Bertie Ahern, the Irish Taoiseach. On entering the office, the Irishman immediately walked out again, refusing to return until Cook took down from the wall a picture "of that murdering bastard" Cromwell.
Oliver Cromwell and the Conquest of Ireland
by Micheál Ó Siochrú
336 pages, Faber
Cook was unusual among New Labour grandees in that he had a genuine sense of the historical, but he probably never thought much about Cromwell in Ireland. Over the years, I have often been dismayed by my leftwing English friends' sympathy for Cromwell's cause. They tend to see him as a radical hero of the English civil war, politically flawed to be sure, especially when he sided with the propertied elite during the Putney debates, but without whose military genius and political vision the revolution of 1649 would not have been possible. His personal traumas and his constant, neurotic search for the true path can also make him seem attractively post-Freudian and modern, certainly compared to his great adversary Charles I, whose psychological outlook was so one-dimensionally rooted in the medieval theory of the divine right of kings that self-interrogation of any kind was impossible.
My friends are merely following generations of historians who have gathered to praise "the reluctant and apologetic dictator". Macaulay saw in Cromwell "the best qualities of the middle classes". To SR Gardiner he was no less than "the greatest Englishman of all time", while Churchill saluted his "place in the forward march of liberal ideas". When in the 1970s Ivan Roots edited a collection of essays on the great man, he found time for chapters on "Cromwell's Genius", "The Achievement of Oliver Cromwell" and even "Industrial Laissez-Faire and the Policy of Cromwell", but no space for an examination of what he did in Ireland. Still more recently, Cromwell's cause has been helped by the eccentric outpourings of amateur Irish historians who have absolved their hero from charges of mass murder and declared him "an honourable enemy", much to the delight of some pro-Union commentators.
By the time Cromwell landed with his troops at Ringsend outside Dublin, on August 15 1649, Ireland had been in rebellion for eight years. What began as an attempted coup by the Catholic Irish nobility descended into civil war and atrocities such as the massacre of Protestant settlers in Ulster in the winter of 1641 (at the time puritan MPs claimed 200,000 or more had died, though recent estimates put the figure at 4,000).
The deaths were used to whip up virulent anti-Catholic sentiment and certainly fed the cycle of revenge and cruelty that ensued. On Rathlin Island, Scottish Protestants wiped out the Catholic population, throwing women from the clifftops. As the rebellion continued, an uneasy alliance emerged of the native Irish, English settlers and those still loyal to the Stuart monarchy. Much of Ó Siochrú's narrative is taken up with the complex backgrounds and conflicting interests of various parties, and it is by far the best account of the confederate wars, as they came to be known, that I have read.
The outbreak of hostilities between king and parliament in the summer of 1642 diverted parliament's attention away from the rebellion. Not until after Charles I had been executed could the new republican government muster the resources to conquer Ireland. Cromwell was chosen to lead the invading force. Addressing a crowd of Irish Protestants in Dublin, Cromwell promised rewards for all those carrying on "that great work against the barbarous and bloodthirsty Irish". Promising to restore "that bleeding nation to its former happiness and tranquillity", Cromwell advanced to Drogheda, which was defended by the royalist Sir Arthur Aston.
Aston refused Cromwell's summons to surrender, and on September 11, after fierce fighting, the town was stormed. In his letters to parliament, Cromwell stated simply that he forbade his men "to spare any that were in arms". The scale of the killing was unprecedented. Cromwell's apologists have sought to excuse him on the grounds that while the sack of Drogheda was harsh, it was in accordance with the laws of war. Ó Siochrú argues that even during the Thirty Years War in Germany, considered by contemporaries a byword for wanton cruelty, massacres of this nature were extremely rare, and he cites the defensive tone of Cromwell's letters to parliament as suggestive of his own recognition that he had allowed something terrible to happen.
After Drogheda Cromwell turned south again and there was a rerun at Wexford, with the garrison and many civilians losing their lives. Cromwell justified his actions in part by claiming that the Irish would be terrorised into surrender, but it is more likely that it was parliament's money that secured the day for Cromwell's men: fighting men like to be paid, and his opponents had but a fraction of his resources.
I would have liked a longer discussion of the anti-Catholic and anti-Irish context in which Cromwell operated. The parliamentarian general the Earl of Essex approved the execution of "absolute Irish . . . for he would not have quarter allowed to those". As Michael Braddick says: "By the autumn of 1644 this was near to official policy." In other words, Cromwell's vicious hatred of the Irish was nothing new and it was not particular to him. But this is a quibble, for I don't see how Cromwell's reputation can survive this important book. Ó Siochrú's calm and forensic reconstruction of events at Drogheda and Wexford shows "the greatest Englishman of all time" to have been a pitiless mass murderer. I will be sending it to English friends this Christmas, along with a card inviting them to join a campaign to have Cromwell's statue outside parliament pulled down, cut up and chucked into the Irish Sea.
Thursday, September 18, 2008
Playing the Enemy: Nelson Mandela and the Game that Made a Nation.
By John Carlin.
TOWARDS the end of his 27 years in jail, Nelson Mandela began to yearn for a hotplate. He was being well fed by this point, not least because he was the world’s most famous political prisoner. But his jailers gave him too much food for lunch and not enough for supper. He had taken to saving some of his mid-day meal until the evening, by which time it was cold, and he wanted something to heat it up.
The problem was that the officer in charge of Pollsmoor prison’s maximum-security “C” wing was prickly, insecure, uncomfortable talking in English and virtually allergic to black political prisoners. To get around him, Mr Mandela started reading about rugby, a sport he had never liked but which his jailer, like most Afrikaner men, adored. Then, when they met in a corridor, Mr Mandela immediately launched into a detailed discussion, in Afrikaans, about prop forwards, scrum halves and recent games. His jailer was so charmed that before he knew it he was barking at an underling to “go and get Mandela a hotplate!”
Mr Mandela’s story never fails to inspire. As a young man, he started an armed struggle against apartheid. It went nowhere, and he went to jail. While maturing behind bars, he decided that moral suasion might work where bombs had failed. It did. South Africa’s white rulers surrendered power without a civil war. Several books have been written about Mr Mandela’s crucial role in coaxing his countrymen towards a more civilised form of government. John Carlin’s is the first to tell the tale through the prism of sport.
This premise is not as odd as it sounds. It was not only Mr Mandela’s regal charm that won over white South Africans. It was the fact that he took the trouble to study and understand their culture. At a time when many blacks dismissed rugby as “the brutish, alien pastime of a brutish, alien people”, Mr Mandela saw it as a bridge across the racial chasm.
The game is not an incidental part of Afrikaner culture, like cricket is to the English. To many Afrikaners, who have grown up playing rough games on sun-baked ground so hard that every tumble draws blood, rugby is little short of everything. Mr Mandela knew that if he was to convince these people that one man, one vote would not mean catastrophe, he had to “address their hearts”, not their brains. If the fearsome terrorist on the other side of the negotiating table was a rugby fan, could he really be as bad as they thought?
Mr Carlin focuses on the decade after 1985, when most blacks thought the country was sliding into war. He draws on his experiences as the South Africa correspondent for the Independent, a British newspaper, during the transition to democracy. But the book does not climax, as a standard historical text might, with South Africa’s first proper multi-racial elections in 1994. Instead, it builds up to the rugby world cup final in 1995, which was held in South Africa and which the home team won.
This makes sense. Elections are all very well, but the moment when black South Africans started cheering for a mostly white rugby team, when white fans in the stadium tried gamely to sing a Zulu miners’ anthem and when Mr Mandela donned the green jersey of the Springboks was of another order entirely. “It was the moment I realised that there really was a chance this country could work,” gushes a teary-eyed rugby official.
Mr Carlin brings the story alive by telling it through the eyes of a broad spectrum of South Africans. Among them are Desmond Tutu, the Nobel prize-winning archbishop of Cape Town, who was in America on the day of the final and had to find a bar that would let him watch it at an ungodly hour of the morning; Niel Barnard, a former chief spy for the apartheid regime, who used to keep a thick file on Bishop Tutu; and Justice Bekebeke, a young township firebrand who killed a policeman for firing at a child during a riot and spent time on death row.
Mr Bekebeke is the most interesting of Mr Carlin’s portraits. On the morning of the match, he is still too bitter about a lifetime of injustice to support the Springboks. But then, something changes. The surging emotion of the event sweeps him along. “I just had to give up, to surrender,” he says. “And I said to myself, well, this is the new reality. There is no going back: the South African team is now my team, whoever they are, whatever their colour.”
Many writers reveal the nuts and bolts of South Africa’s transformation to non-racial democracy. But few capture the spirit as well as Mr Carlin.
(Dedicated To a Friend of a Friend)
Cancer may be caused by stem cells gone bad. If that proves to be correct, it should revolutionise treatment
MUCH of medical research is a hard slog for small reward. But, just occasionally, a finding revolutionises the field and cracks open a whole range of diseases. The discovery in the 19th century that many illnesses are caused by bacteria was one such. The unravelling of Mendelian genetics was another. It now seems likely that medical science is on the brink of a finding of equal significance. The underlying biology of that scourge of modern humanity, cancer, looks as though it is about to yield its main secret. If it does, it is possible that the headline-writer’s cliché, “a cure for cancer”, will come true over the years, just as the antibiotics that followed from the discovery of bacteria swept away previously lethal infectious diseases.
The discovery—or, rather, the hypothesis that is now being tested—is that cancers grow from stem cells in the way that healthy organs do. A stem cell is one that, when it divides, produces two unequal daughters. One remains a stem cell while the other multiplies into the sorts of cells required by its organ. This matters for cancer because, at the moment, all the cells of a tumour are seen as more or less equivalent. Therapies designed to kill them do not distinguish between them. Success is defined as eliminating as many of them as possible, so those therapies have been refined to do just that. However, if all that the therapies are doing is killing the descendants of the non-stem-cell daughters, the problem has not been eliminated. Instead of attacking the many, you have to attack the few. That means aiming at the stem cells themselves.
Not all investigators support the cancer-stem-cell hypothesis, but the share who do so is growing rapidly. A mere five years ago, few research papers on the subject were presented at big academic meetings. This year there were hundreds at one such meeting alone. Moreover, data from clinical trials based on the hypothesis suggest that it has real value for patients. As a result, drug companies have taken notice and are trying to develop substances that will kill cancer stem cells.
The root cause of both cancer and stem cells is multicellularity. In the distant past, when all living things had only one cell, that cell’s reproduction was at a premium. In the body of an animal, however, most cells have taken a vow of self-denial. Reproduction is delegated to the sex cells. The rest, called somatic cells, are merely supporting actors, specialised for the tasks needed to give the sex cells a chance to get into the next generation. For this to happen required the evolution of genes that were able to curb several billion years’ worth of instinct to proliferate without killing that instinct entirely. Only then could somatic cells do their job, and be present in appropriate numbers.
The standard model of tumour formation was based on the fact that somatic cells slowly accumulate mutations. Sometimes these disable the anti-proliferation genes. If enough of the brakes come off in a somatic cell, so the theory went, it will recover its ancestral vigour and start growing into a tumour. Cancer, then, is an inevitable cost of being multicellular.
The discovery of stem cells changed this picture subtly, but importantly. Blood stem cells were found a long time ago, but only recently has it become apparent that all tissues have stem cells. The instincts of stem cells lie halfway between those of sex cells and ordinary body cells. They never stop reproducing, but they cannot look forward to making the generational leap. When the body dies, so do they. However, they are few in number, and because at cell division only one daughter continues to be a stem cell, that number does not grow.
This division of labour may even be another type of anti-cancer mechanism. It allows stringent locks to be put on somatic cells (which, for example, strictly limit the number of times they can divide), yet it permits tissue to be renewed. Without stem cells, such tissue-renewal would be the province of any and every somatic cell—a recipe, as the traditional model observes, for tumorous disaster. The obverse of this, however, is that if a stem cell does mutate into something bad, it is likely to be very bad indeed. That, in essence, is the stem-cell hypothesis of cancer.
One obvious prediction of this hypothesis is that tumours will have at least two sorts of cell in them: a dominant population of daughter cells and a minority one of stem cells. The first person to show that to be true was John Dick, a molecular biologist at the University of Toronto. In 1997 he isolated what looked like stem cells from a blood cancer called acute myeloid leukaemia (AML). Blood cancers are easier to deal with in this context than solid tumours because their cells do not have to be separated from one another before they are examined. One characteristic of AML cells is that they carry two proteins, called CD34 and CD38, on their surfaces. Dr Dick thus used two sets of special antibodies for his experiment. One sort stuck only to CD34, the other only to CD38. Each sort was also attached to a fluorescent tag.
By mixing the AML cells from his patients with the two antibodies and running them through a machine that sorted them according to how they fluoresced, he showed that most were positive for both proteins. However, a small fraction (as low as 0.2%) were positive only for CD34. These, he suspected, were the stem cells.
He was able to confirm this by injecting the minority cells into mice. The resulting tumours had the same mix of cells as those from human patients. However, when he injected mice with samples of the majority cells, which carried both proteins, no tumours resulted. The CD34-only cells thus acted as cancer stem cells.
Moreover, this phenomenon was not confined to leukaemia. In 2003 a group of researchers at the University of Michigan in Ann Arbor, led by Max Wicha and Michael Clarke, used a similar trick on breast-cancer cells. In this case the surface proteins were known as CD24 and CD44, and the minority were those positive only for CD44. As with AML, these minority cells produced cancers in mice, whereas the majority cells did not.
Since these two pieces of work, the list of cancer stem cells has multiplied. It now includes tumours of the breast, brain, prostate, colon, pancreas, ovary, lung, bladder, head and neck, as well as melanoma, sarcoma, AML, chronic myelogenous leukaemia, Hodgkin’s lymphoma and myeloma.
That is quite a list. The question is, what can be done with it? Jeremy Rich, a neurologist at Duke University in Durham, North Carolina, has one idea. He created mice that had human glioblastoma tumours, a form of brain cancer, growing in them. Then he treated these mice with radiation (the standard therapy for such cancer in people). He found that the cancer stem cells were more likely to survive this treatment than the other cells in the tumour. That turned out to be because, although all the tumour cells suffered equal amounts of DNA damage from the radiation, the stem cells were better able to repair this damage. When he treated the mice simultaneously with radiation and with a drug that interferes with DNA repair, however, the stem cells no longer had an advantage. They were killed by the radiation along with the other cells.
If that result applies to people as well as rodents, it opens up a whole avenue of possibility. In fact, Dr Rich is now in negotiations with several companies, with a view to testing the idea in humans. That “if” is a real one, though. A mouse is not a human being.
Indeed, the stem-cell hypothesis is often criticised for its reliance on animal models of disease. Some researchers worry that the experiments used to identify putative cancer stem cells are too far removed from reality—human tumour cells do not naturally need to survive in mice—and thus may not reflect human cancer biology at all.
Proponents of the hypothesis are alive to that concern, but they think that the same pattern has been seen so often in so many different cancers that it is unlikely to be completely wrong. The practical test, though, will be whether the hypothesis and ideas that emanate from it, such as Dr Rich’s combination therapy, actually help patients survive.
As a step towards discovering whether they do, William Matsui, an oncologist at Johns Hopkins University School of Medicine in Baltimore, looked for cancer stem cells in pancreatic-tumour samples taken from nearly 300 patients. His team found that patients whose tumours did contain such stem cells survived for an average of 14 months. Those whose tumours lacked them survived for 18 months.
That finding is consistent with the idea that cancer stem cells contribute to the most aggressive forms of the disease, though it does not prove they cause tumours in the first place. And although the absence of detectable stem cells in some tumours may be seen as casting doubt on the whole idea, it may instead be that they are too rare to be easily detected. If the stem-cell idea is confirmed, it may help doctors and patients choose how to treat different tumours. Those with detectable stem cells might be candidates for aggressive chemical and radiation therapies, while those without might best be treated with the surgeon’s knife alone.
Breast-cancer researchers are also testing the stem-cell hypothesis in the clinic. Jenny Chang’s group at Baylor College of Medicine, in Texas, took samples of tumours from women before and after standard chemotherapy. When they counted the cells in the tissue they found that the proportion of stem cells in a tumour increased after treatment. That suggests the chemotherapy was killing the non-stem tumour cells and leaving the stem cells behind.
When the group repeated the experiment using a modern drug called Tykerb that blocks what is known as the HER2 pathway, they got a different result. HER2 is a gene which encodes a protein that acts as a receptor for molecules called growth factors which, as their name suggests, encourage cell growth and proliferation. After the HER2-blocking treatment, cancer stem cells formed the same proportion of the residual tumour as beforehand. That suggests they, too, were being clobbered by the new treatment. It is probably no coincidence that another drug, Herceptin, which also goes after HER2, is one of the few medicines that is able to prolong the lives of people with advanced cancer.
The stem-cell hypothesis has also changed the way people do basic research. For example, over the past few years cancer researchers have been grinding up pieces of tumour and using what are known as gene-expression microarrays to work out which genes are active in them. However, if the hypothesis is correct, this approach will probably yield the wrong result, because the crucial cells make up but a small part of a tumour’s bulk and the activity of their genes will be swamped by that of the genes of the more common non-stem cells. The answer is to isolate the stem cells before the grinding starts.
This approach has already yielded one important finding. When Dr Chang used microarrays to study gene expression in the CD44-positive cells from breast tumours, she noticed that they did not look like those of the epithelial cells that make up the bulk of such a tumour. Epithelial cells are immobile, grow in “cobblestone” patterns and produce proteins that help them stick together. The gene expression of the putative stem cells, however, resembled that of a mesenchymal cell. Mesenchymal cells rarely stick together. Indeed, they are mobile and are able to slip through the matrix of proteins that holds epithelial cells together.
That finding is important because mobile cells are more likely to escape from a tumour and form secondary cancers elsewhere in the body. Once such secondaries are established, successful treatment is much harder. And the CD44-positive cells also expressed genes that are important for stem-cell self-renewal, particularly one called Notch that controls the flow of chemical signals within a cell.
Researchers at OSI Pharmaceuticals, a firm that makes a drug called Tarceva, found a similar pattern in lung cancer. Several years ago, they started looking for gene-expression patterns that correlated with response to Tarceva. They found that tumours with a pattern that resembled epithelial cells were sensitive to the drug. By contrast, those that had a mesenchymal pattern were not. They hypothesised that as tumours develop, some of their cells actually switch from a sticky, epithelial state to a mobile, mesenchymal one. This epithelial-to-mesenchymal transition, or EMT, is well known to biologists who study embryonic development, but OSI’s results, and those of other researchers, suggest that cancers may have hijacked it for their own use.
Robert Weinberg, a molecular biologist at the Massachusetts Institute of Technology, and his colleagues have come to the same conclusion but they have taken the hypothesis one step further. They think that tumour cells which have undergone EMT have acquired many of the characteristics of cancer stem cells. Experiments in his laboratory, employing a variety of animal models of breast cancer, suggest that communication between tumour cells and surrounding non-cancerous support cells can lead some of the cancer cells to undergo EMT.
That is intriguing. If this transition really can be induced in tumour cells, then any of them might be able to become a cancer stem cell. So it may be that the fundamentalist version of the stem-cell hypothesis is wrong, and the stem cells are a result of a cancer, rather than its cause. That could be another reason why Dr Matsui found that pancreatic cancers do not always seem to contain stem cells.
Dr Weinberg is sensitive to this point, and is cautious when talking about these experiments. He refers to the cells that have undergone EMT as “having the qualities of stem cells” but avoids actually calling them cancer stem cells. If his idea is correct, though, it means that finding drugs which block the signals that induce EMT could reduce the stem-cell population and prolong the survival of the patient. It also means that both the epithelial cells and the mesenchymal ones will have to be attacked. And OSI is now testing a drug that does just that.
Breast-cancer researchers, too, are testing drugs that hit molecular targets highlighted by cancer-stem-cell studies. Merck, for example, has turned to a drug it originally developed to treat Alzheimer’s disease. Although this drug, code-named MK0752, did not slow that disease, it does block activity of Notch, the stem-cell self-renewal gene, and might thus be an appropriate weapon against breast-cancer stem cells. Dr Chang and Dr Wicha have started a clinical trial which uses MK0752 in combination with standard chemotherapy. By the end of the year they hope to have some idea of whether the combination kills cancer cells in human tumours.
Attacking Notch is a high-risk approach, because this gene is used by healthy stem cells as well as cancerous ones; healthy organs as well as tumours could be damaged. Some researchers are therefore taking a different tack and looking for drugs that hit only the unhealthy stem cells. Craig Jordan, a biologist at the University of Rochester Medical Centre, in New York state, is one such. He has discovered that a chemical called parthenolide, found in feverfew, a medicinal plant, kills AML stem cells. Normal stem cells, however, seem to be able to tolerate the drug without difficulty. The reason is that the leukaemia cells are reliant on a biochemical pathway that parthenolide blocks, whereas normal stem cells are not. If all goes well, a trial to test the safety of a modified form of parthenolide will start in a few months.
If the safety issues can be dealt with—and most researchers think they can—then attacking cancer stem cells really could help patients survive. If, that is, the stem-cell hypothesis is correct.
At the moment, scientists being scientists, few are willing to be anything other than cautious. They have seen too many past cures for cancer vanish in a puff of smoke. The proof needs to come from patients—preferably with them living longer. But if the stem-cell hypothesis is indeed shown to be correct, it will have the great virtue of unifying and simplifying the understanding of what cancer is. And that alone is reason for hope.
(from The Economist)
Monday, September 15, 2008
The first time I heard the word "pizza" was in the film E.T.: Elliott's elder brother ordered a pizza over the phone, and I wondered what that was...
Then, as a teenager on holiday, I ate my first slice of pizza with my lovely friend... that was a real discovery!
So to you, my friend - for pizzas (and a beer!).
Sunday, September 14, 2008
David Foster Wallace is the award-winning author of several novels, more than a few short stories and numerous articles, as well as being a college professor. Still in his 30s, David has been called one of America's most important young authors and is often compared to Thomas Pynchon, though he tends to shrug off that association. He is most widely known for his epic (1,000+ page) novel, Infinite Jest, published in 1996 and acclaimed by critics and readers alike. Topics covered in Wallace's work are wide-ranging, but he seems to have a special interest in American culture, addictions and excess. Ironically, because of his edgy body of work and his public persona, DFW has gained a cult following and become something of a celebrity himself...
... but I didn't know him.
Tuesday, September 09, 2008
Monday, September 08, 2008
A Beloved Professor Delivers The Lecture of a Lifetime
Randy Pausch, a Carnegie Mellon University computer-science professor, was about to give a lecture Tuesday afternoon, but before he said a word, he received a standing ovation from 400 students and colleagues.
He motioned to them to sit down. "Make me earn it," he said.
They had come to see him give what was billed as his "last lecture." This is a common title for talks on college campuses today. Schools such as Stanford and the University of Alabama have mounted "Last Lecture Series," in which top professors are asked to think deeply about what matters to them and to give hypothetical final talks. For the audience, the question to be mulled is this: What wisdom would we impart to the world if we knew it was our last chance?
It can be an intriguing hour, watching healthy professors consider their demise and ruminate over subjects dear to them. At the University of Northern Iowa, instructor Penny O'Connor recently titled her lecture "Get Over Yourself." At Cornell, Ellis Hanson, who teaches a course titled "Desire," spoke about sex and technology.
At Carnegie Mellon, however, Dr. Pausch's speech was more than just an academic exercise. The 46-year-old father of three has pancreatic cancer and expects to live for just a few months. His lecture, using images on a giant screen, turned out to be a rollicking and riveting journey through the lessons of his life.
He began by showing his CT scans, revealing 10 tumors on his liver. But after that, he talked about living. If anyone expected him to be morose, he said, "I'm sorry to disappoint you." He then dropped to the floor and did one-handed pushups.
(Photo: Randy Pausch and his three children, ages 5, 2 and 1.)
Clicking through photos of himself as a boy, he talked about his childhood dreams: to win giant stuffed animals at carnivals, to walk in zero gravity, to design Disney rides, to write a World Book entry. By adulthood, he had achieved each goal. As proof, he had students carry out all the huge stuffed animals he'd won in his life, which he gave to audience members. After all, he doesn't need them anymore.
He paid tribute to his techie background. "I've experienced a deathbed conversion," he said, smiling. "I just bought a Macintosh." Flashing his rejection letters on the screen, he talked about setbacks in his career, repeating: "Brick walls are there for a reason. They let us prove how badly we want things." He encouraged us to be patient with others. "Wait long enough, and people will surprise and impress you." After showing photos of his childhood bedroom, decorated with mathematical notations he'd drawn on the walls, he said: "If your kids want to paint their bedrooms, as a favor to me, let 'em do it."
While displaying photos of his bosses and students over the years, he said that helping others fulfill their dreams is even more fun than achieving your own. He talked of requiring his students to create videogames without sex and violence. "You'd be surprised how many 19-year-old boys run out of ideas when you take those possibilities away," he said, but they all rose to the challenge.
He also saluted his parents, who let him make his childhood bedroom his domain, even if his wall etchings hurt the home's resale value. He knew his mom was proud of him when he got his Ph.D, he said, despite how she'd introduce him: "This is my son. He's a doctor, but not the kind who helps people."
He then spoke about his legacy. Considered one of the nation's foremost teachers of videogame and virtual-reality technology, he helped develop "Alice," a Carnegie Mellon software project that allows people to easily create 3-D animations. It had one million downloads in the past year, and usage is expected to soar.
"Like Moses, I get to see the Promised Land, but I don't get to step foot in it," Dr. Pausch said. "That's OK. I will live on in Alice."
Many people have given last speeches without realizing it. The day before he was killed, Martin Luther King Jr. spoke prophetically: "Like anybody, I would like to live a long life. Longevity has its place." He talked of how he had seen the Promised Land, even though "I may not get there with you."
Dr. Pausch's lecture, in the same way, became a call to his colleagues and students to go on without him and do great things. But he was also addressing those closer to his heart.
Near the end of his talk, he had a cake brought out for his wife, whose birthday was the day before. As she cried and they embraced on stage, the audience sang "Happy Birthday," many wiping away their own tears.
Dr. Pausch's speech was taped so his children, ages 5, 2 and 1, can watch it when they're older. His last words in his last lecture were simple: "This was for my kids." Then those of us in the audience rose for one last standing ovation.
by Jeffrey Zaslow
Sunday, September 07, 2008
Saturday, September 06, 2008
He teeters along the crumbling top
of the garden wall and calls, "Look up,
Papa, look up! I'm flying . . ." till,
in a sudden foreseen spasm, I see him fall.
when fear cries to the senses, when the whirl
of the possible drowns the real. Falling
is a fright in me. I call
and move in time to catch
his small, sweat-beaded body,
still thrilled with the air.
"I flew, Papa, I flew!"
"I know, child, I know."
By Alastair Reid
Thursday, September 04, 2008
Wednesday, September 03, 2008
- The Three Wise Men.
- The Three Musketeers (and D'Artagnan).
- The Holy Trinity.
- The three Olympic medals.
- The three crosses.
- The Three Little Pigs.
- ...and three is a crowd.
- Jupiter, Pete and Bob, the Three Investigators.
- Ménage à trois.
Three years already since I started this blog... thanks to the friend who inspired me!