Showing posts with label Technology. Show all posts

Monday, December 26, 2011

Evolution of Press in UK

A million iPads and Kindles may have been unwrapped on Sunday – according to tentative analyst estimates – an influx of portable technology that is expected to hasten the decline in the already faltering sales of printed newspapers, adding pressure to the business models that have traditionally supported so many titles around the country.

Publishers, preparing for the handheld arrivals, took the chance to break with a tradition that dates back to 1912, when publishers agreed not to produce Christmas Day papers to give paperboys, among others, a day off. For the first time in its 190-year history the Sunday Times published a digital-only edition on 25 December – with the normally paid-for product given away in the hope of luring sought-after digital subscribers.

Boxing Day publication, for dailies like the Guardian, has also become a necessity – to ensure there are digital editions for new Kindle and iPad owners to read. The result is that what was traditionally a quiet period for news has become a critical moment to showcase new work, at a time when an industry already riven by the phone-hacking scandal, and under judicial examination, is facing what can be described as an existential crisis.

Fifty years ago two national dailies – the Daily Mirror and the Daily Express – sold more than 4m copies each; today the bestselling Sun sells 2.6m. In the last year alone, printed sales declined by 10% for daily broadsheets and by 5% for daily tabloids – and when the News of the World stopped printing last July 600,000 copy sales simply disappeared.

The knock-on impact of the decline has been a push for digital readers that has seen newspapers like the Daily Mail win 5m unique visitors a day – compared with its printed sale of 2m – but struggle to generate revenues to match. The Mail generated £16m from its website last year, out of £608m overall.

Some specialist titles, such as the Financial Times, are managing the transition well – it has 260,000 digital subscribers – up 40% this year – compared with 337,000 buyers of the printed product, where sales are down by 12%. Digital subscribers generate £180 a year and the paper, priced at £2.50 on the newsstand on a weekday, is profitable.
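A quick back-of-the-envelope calculation shows what those FT subscriber figures imply (a sketch using only the numbers quoted above; nothing here comes from the FT's actual accounts):

```python
# Rough arithmetic implied by the FT figures quoted above.
digital_subscribers = 260_000
revenue_per_digital_subscriber = 180  # pounds per year, per the article

digital_subscription_revenue = digital_subscribers * revenue_per_digital_subscriber
print(f"Implied digital subscription revenue: £{digital_subscription_revenue / 1e6:.1f}m a year")
# → roughly £46.8m a year from digital subscriptions alone
```

On those figures, digital subscriptions alone would bring in close to £47m a year – context for Ridding's claim below that digital will soon overtake print.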

John Ridding, the managing director, says that 30% of the FT's revenues come from digital sales and that "within two or three years" digital readers and revenues will account for more than those from the printed business. During a typical week the number of people signing on digitally is "five to 10 times" what it was a year earlier, as the newspaper looks to a future beyond print.

Others, though, are under pressure. Local newspapers have been hit particularly hard, with 31 titles closing in the last year. Most of those shutting are freesheets – with titles distributed in Yeovil, Scarborough and Harlow lost. Historic paid-for titles have seen their frequency cut: the Liverpool Daily Post is to go weekly in print in the new year, after sales dropped as low as 6,500. Its website, however, will update in real time. Daily titles in Birmingham and Bath have also gone the same way in recent years – while pre-tax profits at Johnston Press, the owner of the Scotsman and the Yorkshire Post, fell from £131.5m five years ago to £16m last year.

Roger Parry, chair of Johnston Press since 2009, believes the party has been over for several years, since Craigslist and Google began to take classified advertising away from local press.

"I think the future is for local multimedia companies which focus on signing up 50% plus of the households in their area on some form of subscription – that's what happens in Scandinavia," he says. For journalists there will have to be a shift from acting as "print writers to multimedia curators. There will be more content created by local people. The National Union of Journalists will hate this but it is a fact of life."

The culture secretary, Jeremy Hunt, wants to license local television stations in 20 cities, giving local media a new way to reach audiences, although some – such as Witney TV in Oxfordshire – have already made a start with a daily offering of local video news. David Cameron, the local MP, regularly appears, but the site is staffed by volunteers, and its content limited – underlining how tough the digital economics are.

There are commercial pressures in national media too. Although the tabloid media have faced criticism at the Leveson inquiry, not least from the likes of Hugh Grant or Steve Coogan, popular titles remain in fair commercial health. Trinity Mirror's stable of nationals – the Daily and Sunday Mirror, the People, and the Record titles in Scotland – will earn about £70m this year, although they made £86m the year before. The profit margin at the Daily Mail hovers at around 10%.

The challenge for the popular press is retaining printed sales – but the financial pressure is acute elsewhere. Three of the traditional broadsheets – the Independent, the Times and the Guardian – all lose money in a market where five titles compete for 1.3 million print buyers. Their readers are more likely to make the digital transition too, leaving newspapers no option but to embrace new forms of reporting – such as the live blog – and seed content at digital hubs, such as Facebook.

The Guardian may generate £40m in digital revenues from its largely free offerings, but some of that comes from its dating sites. The Times titles have gone for a low price subscription model, which has attracted 111,000 takers, but which generates £11m a year against an editorial budget estimated at £100m.

Some, like Paul Zwillenberg of Boston Consulting Group, say serious newspapers "will have to cut their cloth because there will be a smaller pool of revenue and profit". But Zwillenberg acknowledges that by pursuing different business models, they may increase their chances of success. The result, though, is that what was once an industry with one business model – a printed product sold on the newsstand – is fracturing into very different types of mainly digital content companies.



Monday, August 29, 2011

Selling Content - That's a Job

Ten years is, of course, a long time in media. Ten years ago, if you wanted to download some music, your best bet was Napster or one of the filesharing systems such as LimeWire or KaZaA. There were legal services, but they were so dire they wouldn't pass muster today: there were PressPlay and MusicNet (from rival groups of record companies), which required $15-a-month subscriptions for low-quality streaming (when most people had dialup connections, not today's broadband). You couldn't burn to CD. They were stuffed with restrictive software to prevent you sharing the songs.

What happened? Steve Jobs happened, mainly. The hardware and design team at Apple came up with the iPod (initially intended to be a way to sell more Macintosh computers), and then followed the iTunes Music Store – a great way to tie people to Apple by selling music. In 2003 Jobs persuaded the music companies – which wouldn't license their songs to bigger names like Microsoft – to go with him because, he said, Apple was tiny (which it was, at the time). The risk if people did start sharing songs from the store was minimal, he argued. The record labels looked at Apple's tiny market share (a few per cent of the PC market) and reckoned they'd sell about a million songs a year, so they signed up.

Apple sold a million in the first week of the iTunes Music Store being open (and only in the US). It sold 3m within a month. It's never looked back.

Nowadays Apple sells TV shows, films, books, apps, as well as music. We take the explosion in available content for granted. But without Jobs, it's likely we wouldn't be here at all; his negotiating skill is the thing that Apple, and possibly the media industry, will miss the most, because he got them to open up to new delivery mechanisms.

Content companies have been reluctant to let their products move to new formats if they aren't the inventors, or at least midwives. Witness Blu-ray, a Sony idea which wraps up the content so you can't ever get it off the disc (at least in theory); or 3D films. Yet neither is quite living up to its promise, and part of that comes down to people wanting to be able to move the content around – on an iPod, iPhone, iPad or even a computer – in ways the content doesn't allow. Apps downloaded directly to your mobile? Carriers would never have allowed it five years ago. Flat-rate data plans? Ditto. But all good for content creators.

Jobs pried open many content companies' thinking, because his focus was always on getting something great to the customer with as few obstacles as possible. In that sense, he was like a corporate embodiment of the internet; except he thought people should pay for what they got. He always, always insisted you should pay for value, and that extended to content too. The App Store and Music Store remain among the biggest generators of purely digital revenue in the world, and certainly the most diverse; while Google's Android might be the fastest-selling smartphone OS, its Market generates pitiful revenues, and I haven't heard of anyone proclaiming their successes from selling music, films or books through Google's offerings.

Jobs's resignation might look like the end of an era, and for certain parts of the technology industry it is. For the content industries, it's also a loss: Jobs was a champion of getting customers who would pay you for your stuff. The fact that magazine apps like The Daily haven't set the world alight (yet?) isn't a failure of the iPad (which is selling 9m a quarter while still only 15 months old; at the same point in the iPod's life, just 219,000 were sold in the financial quarter, compared with the 22m – 100 times more – of its peak). It's more like a reflection of our times.

So if you're wondering how Jobs's departure affects the media world, consider that it's the loss of one of the biggest boosters of paid-for content the business ever had. Who's going to replace that?

by Charles Arthur


Sunday, August 28, 2011

SanFran is not The Big Apple


Steve Jobs in his Los Angeles office in 1981, five years after he co-founded Apple.
Photograph: Tony Korody/Corbis

Steve Jobs's resignation was the most discussed in corporate history. Because his illness has been public knowledge for so long, and because Wall Street and the commentariat viewed his health as being synonymous with that of his company, Apple's share price has for years fluctuated with its CEO's temperature. If all the "Whither Apple without Jobs?" articles were laid end to end, they would cover quite a distance – but they never reached a conclusion.

Still, you could understand the hysteria. After all, he's the man who rescued Apple from the near-death experience it underwent in the mid-1990s. When he came back in 1996, the company seemed headed for oblivion. Granted, it was a distinctive, quirky outfit, but one that had been run into the ground by mediocre executives who had no vision, no strategy – and no operating system to power its products into the future.

Jobs came back because Apple bought NeXT, the computer workstation company he had started after being ousted by the Apple board in 1985. By acquiring NeXT, Apple got two things: the operating system that became OS X, the software that underpinned everything Apple has made since; and Jobs as "interim CEO" at a salary of $1 a year. But it was still a corporate minnow: a BMW to Microsoft's Ford. Fifteen years later, Apple had become the most valuable company in the world.

It was the greatest comeback since Lazarus. Because only an obsessive, authoritarian, visionary genius could have achieved such a transformation, it's easy to see why Wall Street has had difficulty imagining Apple without Jobs. He was, after all, the only CEO in the world with rock star status. And Apple is a corporate extension of his remarkable personality, much as Microsoft was of Bill Gates's. But Jobs has something Gates never had – a reputation so powerful as to create a reality distortion field around him.

This field has blinded people to some under-appreciated facts. While it is true, for example, that Apple – under Jobs's influence – is probably the world's best industrial design outfit, it is also a phenomenally well-run company. Proof of that comes from various sources. For example, not only does it regularly dream up beautiful, functional and fantastically complex products, but it gets them to market in working order, on time and to budget; and it has continually done so despite exploding demand. Compare that with slow-motion car crashes such as Hewlett-Packard's TouchPad, RIM's BlackBerry PlayBook or Microsoft's Vista operating system.

Then there's the way that Apple – in the teeth of industry scepticism – made such an astonishing success of bricks-and-mortar retailing with its high-street stores. Or ponder the fact that it became the world's most valuable corporation without incurring a single cent of debt. Instead, it sits atop a $78bn (£48bn) cash mountain: enough to buy Tesco and BT and still have loose change. Compare that with the casino capitalism practised by so many MBA-educated company leaders in the US. And finally there is the stranglehold Apple now has on a number of crucial modern markets – computers, online music, mobile devices and smartphones.

If you ask people what Steve Jobs is best remembered for, most will name a particular product. If they're from my (baby boomer) generation, it will probably be the Apple Macintosh, a computer that changed many of our lives in the 1980s. Younger generations will credit him with the iMac, iTunes and the iPod. Today's teenagers will revere him for the iPhone. But there's a good argument that Jobs's greatest creation is Apple itself in its post-1996 incarnation. If that's true, the great test of his career legacy is whether the organisation he built around his values will endure and remain faithful to them.

What are those values? He usually expressed them as aphorisms and, as news of his resignation spread, people began raking through them for clues. Many focused on what he said to John Sculley, CEO of Pepsi, when he was trying to persuade him to run Apple.

"Do you want to spend the rest of your life selling sugar water," he asked, "or do you want to change the world?" (Sculley accepted the invitation, then presided over Jobs's expulsion.) But for Jobs it was a serious question. What he was asking, as the blogger Umair Haque put it, was: "Do you really want to spend your days slaving over work that fails to inspire, on stuff that fails to count, for reasons that fail to touch the soul of anyone?"

Jobs is famously fanatical about design. In part this is about how things look (though for him it also involves simplicity of use). When the rest of the industry was building computers as grey, rectangular metal boxes, for example, he was prowling department stores and streets looking for design metaphors. For a time he thought the Mac should be like a Porsche. At another stage he wanted it to be like the Cuisinart food-processor. When the machine finally appeared in 1984, Jack Tramiel, the grizzled macho-boss of Commodore, thought it looked like a girly device that would be best sold in boutiques. What Tramiel did not realise – and Jobs did – was that ultimately computers would be consumer products and people would pay a huge premium for classy design.

In that sense he is the polar opposite of the MBA-trained, bean-counting executive. "The cure for Apple is not cost-cutting," he said in 1996, when the company was on the rocks. "The cure for Apple is to innovate its way out of its current predicament." At another point he said: "When you're a carpenter making a beautiful chest of drawers, you're not going to use a piece of plywood on the back, even though it faces the wall and nobody will ever see it."

This delight in elegant work has always been the most striking aspect of Jobs's celebrated speeches introducing new Apple products in San Francisco. As he cooed over the iMac or the iPhone or the iPad, words like "beautiful", "amazing" and "awesome" tumbled out. For once they didn't sound like cynical, manipulative corporation-speak. He spoke from the heart.

It goes without saying that he is impossible to work with; most geniuses are. Yet he has built – and retained the respect of – the most remarkable design team in living memory, a group that has been responsible for more innovation than the rest of the computer industry put together. For that reason, when the time comes to sum up Jobs's achievements, most will portray him as a seminal figure in the computing industry. But Jobs is bigger than that.

To understand why, you have to look at the major communications industries of the 20th century – the telephone, radio and movies. As Tim Wu chronicles it in his remarkable book, The Master Switch, each of these industries started out as an open, irrationally exuberant, chaotic muddle of incompatible standards, crummy technology and chancers. The pivotal moment in the evolution of each industry came when a charismatic entrepreneur arrived to offer consumers better quality, higher production values and greater ease of use.

With the telephone it was Theodore Vail of AT&T, offering a unified nationwide network and a guarantee that when you picked up the phone you always got a dial tone. With radio it was David Sarnoff, who founded RCA. With movies it was Adolph Zukor, who created the Hollywood studio system.

Jobs is from the same mould. He believes that using a computer should be delightful, not painful; that it should be easy to seamlessly transfer music from a CD on to a hard drive and thence to an elegant portable player; that mobile phones should be powerful handheld computers that happen to make voice calls; and that a tablet computer is the device that is ushering us into a post-PC world. He has offered consumers a better proposition than the rest of the industry could – and they jumped at it. That's how he built Apple into the world's most valuable company. And it's why he is really the last of the media moguls.

by John Naughton


Thursday, August 25, 2011

iResign



Thursday, August 04, 2011

Children Glued to Screens

Parents should be aware of the rise of "multiple screen" viewing among children on devices including televisions, smartphones and portable games consoles, when monitoring whether they are being active enough, researchers say.

A sedentary lifestyle – linked to spending lots of time watching TV and playing computer games – is thought to increase the risk of obesity and mental health problems, researchers at Bristol and Loughborough universities said in a study published on Wednesday.

They questioned 63 10- and 11-year-olds and found that children enjoyed looking at more than one screen at a time.

Russ Jago, of Bristol University's centre for exercise, nutrition and health sciences, said: "Health campaigns recommend reducing the amount of time children spend watching TV.

"However, the children in this study often had access to at least five different devices at any one time, and many of these devices were portable."

In the study the children would move the equipment between their bedrooms and family rooms, depending on whether they wanted privacy or company, said Jago. "This suggests that we need to work with families to develop strategies to limit the overall time spent multiscreen viewing wherever it occurs within the home."

The children would use a second device to fill in breaks during their entertainment, often talking or texting their friends during adverts or while waiting for computer games to load. The television was also used to provide background entertainment while they were doing something else – especially if the programme chosen by their family was considered "boring".

One respondent said: "I'm on my DSi [handheld games console] and my laptop. On my DSi I'm on MSN and on my laptop I'm on Facebook and then the TV is on."

There needs to be more research into multi-screen viewing.

Jago said: "There is a shortage of information about the nature of contemporary screen viewing amongst children, especially given the rapid advances in screen viewing equipment technology and their widespread availability.

"For example, TV programmes are watched on computers, games consoles can be used to surf the internet, smartphones, tablet computers and hand-held games play music, video games provide internet access, and laptop computers can do all of the above."

The research paper, entitled "I'm on it 24/7 at the moment: a qualitative examination of multiscreen viewing behaviours among UK 10-11 year olds", is published in the International Journal of Behavioural Nutrition and Physical Activity.


Monday, July 18, 2011

The Guy Behind TED


Monday, July 11, 2011

Too Much Information

GOOGLE “information overload” and you are immediately overloaded with information: more than 7m hits in 0.05 seconds. Some of this information is interesting: for example, that the phrase “information overload” was popularised by Alvin Toffler in 1970. Some of it is mere noise: obscure companies promoting their services and even more obscure bloggers sounding off. The overall impression is at once overwhelming and confusing.

“Information overload” is one of the biggest irritations in modern life. There are e-mails to answer, virtual friends to pester, YouTube videos to watch and, back in the physical world, meetings to attend, papers to shuffle and spouses to appease. A survey by Reuters once found that two-thirds of managers believe that the data deluge has made their jobs less satisfying or hurt their personal relationships. One-third think that it has damaged their health. Another survey suggests that most managers think most of the information they receive is useless.

Commentators have coined a profusion of phrases to describe the anxiety and anomie caused by too much information: “data asphyxiation” (William van Winkle), “data smog” (David Shenk), “information fatigue syndrome” (David Lewis), “cognitive overload” (Eric Schmidt) and “time famine” (Leslie Perlow). Johann Hari, a British journalist, notes that there is a good reason why “wired” means both “connected to the internet” and “high, frantic, unable to concentrate”.

These worries are exaggerated. Stick-in-the-muds have always complained about new technologies: the Victorians fussed that the telegraph meant that “the businessman of the present day must be continually on the jump.” And businesspeople have always had to deal with constant pressure and interruptions—hence the word “business”. In his classic study of managerial work in 1973 Henry Mintzberg compared managers to jugglers: they keep 50 balls in the air and periodically check on each one before sending it aloft once more.

Yet clearly there is a problem. It is not merely the dizzying increase in the volume of information (the amount of data being stored doubles every 18 months). It is also the combination of omnipresence and fragmentation. Many professionals are welded to their smartphones. They are also constantly bombarded with unrelated bits and pieces—a poke from a friend one moment, the latest Greek financial tragedy the next.

The data fog is thickening at a time when companies are trying to squeeze ever more out of their workers. A survey in America by Spherion Staffing discovered that 53% of workers had been compelled to take on extra tasks since the recession started. This dismal trend may well continue—many companies remain reluctant to hire new people even as business picks up. So there will be little respite from the dense data smog, which some researchers fear may be poisonous.

They raise three big worries. First, information overload can make people feel anxious and powerless: scientists have discovered that multitaskers produce more stress hormones. Second, overload can reduce creativity. Teresa Amabile of Harvard Business School has spent more than a decade studying the work habits of 238 people, collecting a total of 12,000 diary entries between them. She finds that focus and creativity are connected. People are more likely to be creative if they are allowed to focus on something for some time without interruptions. If constantly interrupted or forced to attend meetings, they are less likely to be creative. Third, overload can also make workers less productive. David Meyer, of the University of Michigan, has shown that people who complete certain tasks in parallel take much longer and make many more errors than people who complete the same tasks in sequence.

Curbing the cacophony

What can be done about information overload? One answer is technological: rely on the people who created the fog to invent filters that will clean it up. Xerox promises to restore “information sanity” by developing better filtering and managing devices. Google is trying to improve its online searches by taking into account more personal information. (Some people fret that this will breach their privacy, but it will probably deliver quicker, more accurate searches.) A popular computer program called “Freedom” disconnects you from the web at preset times.

A second answer involves willpower. Ration your intake. Turn off your mobile phone and internet from time to time.

But such ruses are not enough. Smarter filters cannot stop people from obsessively checking their BlackBerrys. Some do so because it makes them feel important; others because they may be addicted to the “dopamine squirt” they get from receiving messages, as Edward Hallowell and John Ratey, two academics, have argued. And self-discipline can be counter-productive if your company doesn’t embrace it. Some bosses get shirty if their underlings are unreachable even for a few minutes.

Most companies are better at giving employees access to the information superhighway than at teaching them how to drive. This is starting to change. Management consultants have spotted an opportunity. Derek Dean and Caroline Webb of McKinsey urge businesses to embrace three principles to deal with data overload: find time to focus, filter out noise and forget about work when you can. Business leaders are chipping in. David Novak of Yum! Brands urges people to ask themselves whether what they are doing is constructive or a mere “activity”. John Doerr, a venture capitalist, urges people to focus on a narrow range of objectives and filter out everything else. Cristobal Conde of SunGard, an IT firm, preserves “thinking time” in his schedule when he cannot be disturbed. This might sound like common sense. But common sense is rare amid the cacophony of corporate life.

Tuesday, June 28, 2011

International Space Station

The International Space Station (ISS) is an internationally developed research facility, which is being assembled in low Earth orbit and is the largest space station ever constructed. On-orbit construction of the station began in 1998 and is scheduled for completion by 2012. The station is expected to remain in operation until at least 2020, and potentially to 2028. Like many artificial satellites, the ISS can be seen from Earth with the naked eye. The ISS serves as a research laboratory with a microgravity environment in which crews conduct experiments in biology, human biology, physics, astronomy and meteorology. The station provides a unique environment for testing the spacecraft systems that will be required for missions to the Moon and Mars. The ISS is operated by Expedition crews, and has been continuously staffed since 2 November 2000 – an uninterrupted human presence in space of 10 years and 238 days. As of June 2011, the crew of Expedition 28 is aboard.

The ISS is a synthesis of several space station projects that includes the American Freedom, the Soviet/Russian Mir-2, the European Columbus and the Japanese Kibō. Budget constraints led to the merger of these projects into a single multi-national programme. The ISS project began in 1994 with the Shuttle-Mir program, and the first module of the station, Zarya, was launched in 1998 by Russia. Assembly continues, as pressurised modules, external trusses and other components are launched by American space shuttles, Russian Proton rockets and Russian Soyuz rockets. As of November 2009, the station consisted of 11 pressurised modules and an extensive integrated truss structure (ITS). Power is provided by 16 solar arrays mounted on the external truss, in addition to four smaller arrays on the Russian modules. The station is maintained at an orbit between 278 km (173 mi) and 460 km (286 mi) altitude, and travels at an average speed of 27,724 km/h (17,227 mph), completing 15.7 orbits per day.
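Those orbital figures are mutually consistent, as a quick check shows (a sketch assuming a circular orbit at the midpoint of the quoted altitude range and a mean Earth radius of 6,371 km):

```python
import math

EARTH_RADIUS_KM = 6371            # mean Earth radius
altitude_km = (278 + 460) / 2     # midpoint of the quoted altitude range
speed_km_h = 27_724               # average speed quoted above

# Circumference of a circular orbit at that altitude
circumference_km = 2 * math.pi * (EARTH_RADIUS_KM + altitude_km)
period_hours = circumference_km / speed_km_h
orbits_per_day = 24 / period_hours

print(f"Orbital period: about {period_hours * 60:.0f} minutes")
print(f"Orbits per day: {orbits_per_day:.1f}")  # close to the quoted 15.7
```

The result is a period of roughly an hour and a half, which matches the quoted figure of 15.7 orbits per day.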

Operated as a joint project between the five participant space agencies, the station's sections are controlled by mission control centres on the ground operated by the American National Aeronautics and Space Administration (NASA), the Russian Federal Space Agency (RKA), the Japan Aerospace Exploration Agency (JAXA), the Canadian Space Agency (CSA), and the European Space Agency (ESA). The ownership and use of the space station is established in intergovernmental treaties and agreements that allow the Russian Federation to retain full ownership of its own modules, with the remainder of the station allocated between the other international partners. The station is serviced by Soyuz spacecraft, Progress spacecraft, space shuttles, the Automated Transfer Vehicle and the H-II Transfer Vehicle, and has been visited by astronauts and cosmonauts from 15 different nations. The cost of the station has been estimated by ESA as €100 billion over 30 years, although other estimates range from 35 billion dollars to 160 billion dollars. The financing, research capabilities and technical design of the ISS program have been criticised because of the high cost.


Friday, May 20, 2011

MIT at 150



Tuesday, February 15, 2011

Tech News

Robots could soon have an equivalent of the internet and Wikipedia.

European scientists have embarked on a project to let robots share and store what they discover about the world.

Called RoboEarth, it will be a place where robots can upload data when they master a task, and ask for help in carrying out new ones.

Researchers behind it hope it will allow robots to come into service more quickly, armed with a growing library of knowledge about their human masters.

Share plan

The idea behind RoboEarth is to develop methods that help robots encode, exchange and re-use knowledge, said RoboEarth researcher Dr Markus Waibel from the Swiss Federal Institute of Technology in Zurich.

"Most current robots see the world their own way and there's very little standardisation going on," he said. Most researchers using robots typically develop their own way for that machine to build up a corpus of data about the world.

This, said Dr Waibel, made it very difficult for roboticists to share knowledge or for the field to advance rapidly because everyone started off solving the same problems.

By contrast, RoboEarth hopes to start showing how the information that robots discover about the world can be defined so any other robot can find it and use it.

RoboEarth will be a communication system and a database, he said.

In the database will be maps of places that robots work, descriptions of objects they encounter and instructions for how to complete distinct actions.
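A store along those lines could be sketched as follows (an entirely hypothetical structure and field names – the article does not describe RoboEarth's actual schema):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a shared robot-knowledge store in the spirit of
# RoboEarth: maps of places, descriptions of objects, and task instructions.
@dataclass
class KnowledgeEntry:
    kind: str                     # "map", "object" or "task"
    name: str
    data: dict = field(default_factory=dict)

class KnowledgeBase:
    def __init__(self):
        self._entries = {}

    def upload(self, entry: KnowledgeEntry):
        """One robot shares what it has learned."""
        self._entries[(entry.kind, entry.name)] = entry

    def lookup(self, kind: str, name: str):
        """Another robot retrieves it, without re-learning from scratch."""
        return self._entries.get((kind, name))

# One robot masters a task and uploads it...
kb = KnowledgeBase()
kb.upload(KnowledgeEntry("task", "set_table",
                         {"requires": ["plate", "fork", "knife"]}))

# ...and a different robot, entering the kitchen for the first time,
# can ask what "set the table" involves.
task = kb.lookup("task", "set_table")
print(task.data["requires"])  # → ['plate', 'fork', 'knife']
```

The point of the design is the shared key space: once one robot has encoded "set_table" in an agreed format, any other robot can look it up instead of solving the same problem from scratch.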

The human equivalent would be Wikipedia, said Dr Waibel.

"Wikipedia is something that humans use to share knowledge, that everyone can edit, contribute knowledge to and access," he said. "Something like that does not exist for robots."

It would be great, he said, if a robot could enter a location that it had never visited before, consult RoboEarth to learn about that place and the objects and tasks in it and then quickly get to work.

While other projects are working on standardising the way robots sense the world and encode the information they find, RoboEarth tries to go further.

"The key is allowing robots to share knowledge," said Dr Waibel. "That's really new."

RoboEarth is likely to become a tool for the growing number of service and domestic robots that many expect to become a feature in homes in coming decades.

Dr Waibel said it would be a place that would teach robots about the objects that fill the human world and their relationships to each other.

For instance, he said, RoboEarth could help a robot understand what is meant when it is asked to set the table and what objects are required for that task to be completed.
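The database Dr Waibel describes, a shared store of task recipes that one robot uploads and another downloads, can be sketched in a few lines of Python. Every name here (the class, its methods, the example task steps) is hypothetical and purely illustrative; this is not RoboEarth's actual interface.

```python
class RobotKnowledgeBase:
    """Toy sketch of a RoboEarth-style shared store (hypothetical API)."""

    def __init__(self):
        self.tasks = {}  # task name -> list of action steps
        self.maps = {}   # location name -> map data

    def upload_task(self, name, steps):
        # A robot that has mastered a task shares its recipe.
        self.tasks[name] = steps

    def download_task(self, name):
        # Any other robot can reuse that recipe, even in a new location.
        return self.tasks.get(name)


# One robot uploads what it learned...
kb = RobotKnowledgeBase()
kb.upload_task("set the table", ["locate plates", "grasp plate", "place on table"])

# ...and a different robot later reuses it instead of learning from scratch.
steps = kb.download_task("set the table")
print(steps)
```

The point of the sketch is the division of labour: the expensive part (learning the steps) happens once, and the cheap part (downloading them) happens everywhere else.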

The EU-funded project has about 35 researchers working on it and hopes to demonstrate how the system might work by the end of its four-year duration.

Early work has resulted in a way to download descriptions of tasks that are then executed by a robot. Improved maps of locations can also be uploaded.

A system such as RoboEarth was going to be essential, said Dr Waibel, if robots were going to become truly useful to humans.


Wednesday, January 19, 2011

Wednesday, October 13, 2010

Sunday, September 19, 2010

(Un)Social Networks

It is perhaps inevitable given the rise of social networking sites such as Facebook and Twitter that the number of places blocking access to them is also growing. Burma, China, Iran, Harrisburg in Pennsylvania, the roll-call goes on and on.

Harrisburg in Pennsylvania? Can that be true? Can a town better known for its steel industry and agriculture than for internet censorship really have joined the list? For the past week the private Harrisburg University has instituted what it calls a "blackout" of all social networking sites. It has removed from its central server the channels that pipe social media, cutting off access to Twitter and Facebook, instant messaging services and video chat through Skype.

To be fair to the university, its action cannot be equated to those of the Burmese military junta or the ayatollahs of Iran. This is, after all, a modern science and technology college, opened to students five years ago, that offers specialist courses in the use of the internet.

Rather, the idea was to undertake an experiment to find out what impact social media and multitasking were having on college life, its students and faculty alike. It was dreamed up by the university's provost, Eric Darr, who became intrigued when he observed his 16-year-old daughter at home one night. "She had Facebook open on her laptop, was listening to music on iTunes, had apps open on her iPhone and three different conversations going on instant messaging – all simultaneously," he said. "It struck me how overpowering all this was, not in a negative way, and it made me wonder what would happen if all that wasn't there."

On Monday morning the university closed channels to the social networking sites so no access could be gained via the university's central wireless system. The reaction of the 800 or so students ranged from curious to puzzled to outraged.

Darr was in the room when one student moaned that without Facebook on his laptop in class he didn't know what to do. Darr said: "I was standing right there, and said to him there's always the novel idea of paying attention to your professor." Alexis Rivera, an 18-year-old student of internet security, said she had been surprised by the effect of being deprived of her beloved instant messaging and Facebook. "It's a lot better," she said. "I can pay attention much better now."

As it is a laptop university, students have computers open at most lectures. In an average class, Rivera would have AOL, Yahoo, MSN and Skype instant messaging running, with up to seven chats going at the same time. "Normally I'd be chatting to other people in the class about how boring it was," she said. This week, without the distractions, she has found herself taking more notes and following the tutor with greater understanding. She has been doing more homework, as in the past she often missed assignments because she was so busy messaging she didn't hear them. And she's also become more outgoing. "I'm a lot more social," she said. "I talk to a lot more people, face to face, rather than sitting there typing away."

Others have been less enthusiastic. Several of the college's computer geeks have rerouted internet access through Canada or Norway or used proxy websites to break through the firewall. Some students have nipped to the nearby Hilton hotel to use its wireless access.

Giovanni Acosta, 21, knows how to overcome the blackout but decided against it, as he wanted to see the outcome of the experiment. At first he said he was twitchy. "I had to log on to Facebook even though I knew it was blocked, and I did that every 10 minutes or so, again and again," he said. "But now the itch has gone. I've learnt how much I was being distracted."



Tuesday, January 12, 2010

Friday, October 30, 2009

40 Years Old Already...


An internet that speaks to you

Progress towards making the net more multi-lingual is welcome says Bill Thompson.

It is 40 years to the week since the first data packets were sent over the Arpanet.

That was the research network commissioned by the US Department of Defense Advanced Research Projects Agency (Arpa) to see whether computer-to-computer communications could be made faster, more reliable and more robust by using the novel technique of packet switching instead of the more conventional circuit switched networks of the day.

Rather than connecting computers as telephone exchanges do, with switches setting up an electrical circuit over which data can be sent, packet switching breaks a message into chunks and sends each chunk - or packet - separately, reassembling them at the receiving end.
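The packet-switching idea is simple enough to sketch: number the chunks, send them independently, and let the receiver put them back in order. A minimal Python illustration (the message text and packet size are arbitrary):

```python
import random


def packetize(message, size=4):
    # Break the message into numbered chunks ("packets").
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]


def reassemble(packets):
    # Packets may arrive in any order; sequence numbers restore it.
    return "".join(chunk for _, chunk in sorted(packets))


packets = packetize("LOGIN REQUEST FROM UCLA")
random.shuffle(packets)      # simulate out-of-order delivery
print(reassemble(packets))   # -> "LOGIN REQUEST FROM UCLA"
```

Because each packet carries its own sequence number, no circuit has to be held open for the duration of the conversation, which is what made the technique more robust than circuit switching.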

Late on 29 October 1969, Charley Kline sat down at a terminal in the computing laboratory at UCLA, where he was a student, and established a link to a system at the nearby Stanford Research Institute, sending the first data packets over the nascent Arpanet.


Later in the year permanent links were made between four sites in the US, and over the following years the Arpanet grew into a worldwide research network.

Arpanet was one of the computer networks that coalesced into today's internet, and the influence of the standards and protocols established there can still be seen today, making this anniversary as important for historians of the network society as July's celebration of the 1969 Apollo 11 landing is for those who study space science.

Technology does not stand still, and over the years the way computers communicate with each other has changed enormously. Early Arpanet computers used the Network Control Protocol to talk to each other, but in 1983 this was replaced with the more powerful and flexible TCP/IP - the transmission control protocol and internet protocol.

Today we are in the process of migrating our networks from IP version 4 to IP version 6, which allows for more devices to be connected to the network and is more secure and robust, but work continues to improve and refine all aspects of the network architecture.

One area that is changing is the domain name system, DNS. This links the unique number that identifies every device on the internet with one or more names, making it possible to type in "www.bbc.co.uk" and go to the right web server without having to remember its number.

Designed by engineer Paul Mockapetris in 1983, DNS is a vital component not only of the web but of the wider network, including e-mail and instant messaging. Every time a program uses a name for a computer instead of a number, DNS is involved.
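The essence of DNS - a name goes in, a numeric address comes out - can be sketched with a simple lookup table. This is only an illustration of the idea: the addresses below are placeholders, not real records, and a real resolver queries a distributed hierarchy of servers rather than a local dictionary.

```python
# A toy name-to-address table standing in for the real, distributed DNS.
# The addresses are illustrative placeholders, not live records.
records = {
    "www.bbc.co.uk": "212.58.244.70",
    "www.example.com": "93.184.216.34",
}


def resolve(name):
    # Real resolvers walk a hierarchy of name servers; a dictionary
    # captures the core contract: name in, numeric address out.
    address = records.get(name.lower())
    if address is None:
        raise LookupError(f"no record for {name}")
    return address


print(resolve("www.bbc.co.uk"))  # -> 212.58.244.70
```

The case-insensitive lookup mirrors a real property of DNS: domain names are matched without regard to letter case.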

However DNS, like so much of the network's architecture, was developed by English-speaking westerners, and its original design only allowed standard ASCII characters to be used in names.

ASCII, the American Standard Code for Information Interchange, is a way of representing letters, numbers and punctuation in the binary code used by computers, and was originally based on old telegraphic codes.

It works really well for English, but had to be extended and updated to cope with other alphabets, and has now been replaced by the much more powerful and capable Unicode standard, able to represent non-Latin languages as well as those based on the Latin alphabet.
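The difference is easy to demonstrate: ASCII handles English text but rejects other scripts, while Unicode (shown here via its UTF-8 encoding) handles both. A short Python illustration:

```python
# ASCII covers English text...
print("hello".encode("ascii"))  # b'hello'

# ...but not other alphabets: encoding Japanese as ASCII fails.
try:
    "日本".encode("ascii")
except UnicodeEncodeError as err:
    print("ASCII cannot represent it:", err.reason)

# Unicode, here serialised as UTF-8, represents both without trouble.
print("hello".encode("utf-8"))
print("日本".encode("utf-8"))
```

Each ASCII character fits in 7 bits; UTF-8 keeps those same single bytes for English while using multi-byte sequences for everything else, which is why it could replace ASCII without breaking existing English text.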

Being able to write in your own language is one thing, but it's also important to be able to have e-mail or website addresses that use it. Unfortunately the design of DNS meant that key aspects would not work with anything other than ASCII, making it impossible to simply add Chinese or Arabic characters to domain names.

Work has been going on since the mid-1990s to change this and provide what are called "internationalised domain names", and many organisations are now able to have websites and email addresses that include Chinese, Cyrillic, Hebrew, Arabic and many other alphabets.
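Internationalised domain names work by storing a plain-ASCII "xn--" form (the Punycode encoding) in DNS, so existing infrastructure never sees a non-ASCII byte. Python's built-in idna codec performs this translation in both directions; the domain below is made up for illustration:

```python
# An internationalised name, as a user would type it.
unicode_name = "bücher.example"

# The form actually stored in DNS: each non-ASCII label is
# Punycode-encoded and prefixed with "xn--".
ascii_name = unicode_name.encode("idna")
print(ascii_name)                 # b'xn--bcher-kva.example'

# The translation is reversible, so software can display
# the native-alphabet form to the user.
print(ascii_name.decode("idna"))  # bücher.example
```

This is why the re-engineering described below could avoid breaking anything: resolvers and servers that know nothing about Unicode still work, because they only ever handle the ASCII form.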

The process took a significant step forward this week when Icann, the international body that looks after domain names, fast-tracked a proposal to provide internationalised versions of two letter country domains, such as .uk and .jp.

This will finally allow users of these domains to have a domain name that is entirely in characters based on their native language, and marks an important point in the internationalisation of the whole internet.

It has taken a long time to make this happen, but the problems of re-engineering such a key part of the network infrastructure without breaking anything are enormous, and anyone who reads through the technical documentation will see just how complex the process has been.

And it was definitely necessary to do it properly - the fuss over the recent retuning of Freeview boxes in the UK was bad enough, but trying to persuade a billion internet users to update their software to support a new form of DNS would have been impossible.

Over the next five years the majority of new internet users will come from the non English-speaking world. It's good to see that those of us who have helped build the network so far are making it more welcoming for them.

Bill Thompson is an independent journalist and regular commentator on the BBC World Service programme Digital Planet.


Tuesday, October 27, 2009

21st Century Revolution

Try to think back to your life at the end of the last century. What was it like? Do you remember listening to music on CDs? Owning a phone primarily in order to make phone calls? Going to a library to search for information? Buying a map when embarking on a journey? Now compose a list of 10 ways in which your life has changed since. Here are mine:

1 Google In 2002, this was still a bunch of visionaries without a business plan. In just under four years, it went from earning nothing to earning $20bn a year. It accounts for 70% of all searches for information in the world. In October 2006, it splashed out $1.6bn for YouTube, another company without a business plan that was itself barely 18 months old. YouTube now notches up 1 billion searches a day. During the first nine years of the 21st century, American newspaper revenues declined by roughly 50%. Google is on a grandiose journey to digitise just about every word, painting, note, street, mountain, stream, ocean, book, newspaper, animal, insect, photograph and email that ever existed. "Google" passed into common parlance as a verb in around 2002. You google, I google, we all google. It is the most revolutionary word of the decade.

2 Wikipedia How do I know that the first recorded use of the verb "to google" was 8 July 1998, and that Google itself initially used lawyers to discourage the use of the word-as-verb? From Wikipedia – the half-baked, crazy idea of Jimmy Wales (and others) launched in January 2001. How could such a stupid notion – an encyclopedia written by anyone and everyone – ever work? Whatever next? Short answer: the English-language version now has 3 million articles and 1 billion words (which is – according to Wikipedia – 25 times the size of Encyclopaedia Britannica).

3 Twitter Another really stupid idea. As if anything worth saying could possibly be said in 140 characters. Who are these sad people who want to know that some other sad person is waiting for a bus or has just changed a nappy? OK, so there are roughly 18 million of them, but what on earth do they talk about? Yes, people still write/say that. Smarter people recognise that Twitter is one stage on from Google – applying human intelligence and recommendation to the ordering of information… in real time. It makes algorithms look so 1990s.

4 Comment Is Free A plug for the home team here. Launched in March 2006, CiF inverted the traditional model of newspaper comment. That model went as follows: a small number of columnists opined on politics, culture and events. The readers responded by letter or email: a tiny proportion of that response saw the light of day. This assumed if you were, for example, a New York Times reader, that Thomas L Friedman was the one voice you wanted to hear on Venezuela, the Middle East, Russia, Rwanda, Italy, China and Afghanistan. Comment Is Free started from the assumption that – with intelligent editing and moderating – there were thousands of people with voices and opinions worth hearing and that something powerful, plural and diverse could be forged from combining a newspaper's columnists with those other voices. Millions read it every month; around 100,000 actively take part. Work in progress, but it's difficult to imagine ever going back to the old model.

5 BBC iPlayer Launched at the end of 2007 and – again – already it's impossible to imagine life without the ability to view, or listen to, programmes in your own time. And, of course, that leads to the really radical thought that one day anyone will be able to access any BBC content created at any point in the last 70 years via the iPlayer… for free. Unless James Murdoch has really got David Cameron and Jeremy Hunt in some kind of headlock, in which case you'll have to pay for it. Assuming the BBC still exists in any recognisable form.

6 iPhone Launched in June 2007. Can you remember the moment when you first held one? The involuntary gasp as you saw what it could do? The touch screen, the rotating screen, the zooming screen! The satellite street view, the maps, the iPod and phone all in one slick sliver of beauty. And that was just the start. Jonathan Zittrain, the enormously brainy Oxford-and-Harvard web guru, denounced the original iPhone as "sterile" – on the basis that it was a device that only Apple could improve or change. That was way back in June 2008, since when 1 billion iPhone apps have been downloaded and the only limit to what a mobile phone could become is human imagination itself.

7 Craigslist Until 2000, this classified advertising database existed only in San Francisco. Come the new millennium, it expanded into nine more US cities. It is now in nearly 600 cities in more than 50 countries (thank you, Wikipedia). I remember a Guardian Media Group board meeting when one of our in-house digital gurus patiently explained its business model – essentially, free to both advertiser and reader. It then operated from a small building in San Francisco and had 17 employees. I sat there thinking, "This is the beginning of the end for local newspapers." Nothing has happened since to change my mind.

8 Facebook All the silly things people say about Twitter (see above), they said about Facebook. And still do. What a pointless waste of time! Who are these people with empty lives? Etc etc. It is so pointless that there are now more than 300 million people active on the site, doing their pointless things. But, really, to think that, you would have to think it was pointless to want to connect, to create, to share creativity or thought, to discuss, to collaborate, to form groups or to combine with others in mutual interests or passions. If you can't see the point of any of those things, you will not see the point of Facebook.

9 iTunes U Launched in May 2007 and still relatively unknown. The theory is that every university in the world – most of them benefiting from significant public funding – can share all their course lessons, lectures, language classes and laboratory demonstrations with everyone else. For those who haven't discovered it, you find it through iTunes itself. Tap in any search term you want, and it will deliver you content from all the partner universities, which you can then carry around with you on your iPod or iPhone (see above) to listen to on buses, at airports, on long car journeys or (if you are insomniac like me) throughout the night. Take the Workers' Educational Association (WEA) and the Open University, and multiply by 10,000.


10 Spotify Launched to the general public only eight months ago, Spotify is even more life-changing than iTunes, with a library of six million tracks, including a remarkable amount of really quite esoteric classical music. It is – at least at the time of writing – all free, providing you don't mind putting up with the occasional advertisement. Was there really a time when, in order to listen to a particular concerto or symphony, you had to either buy it or scan Radio Times to see if Radio 3 had scheduled it? Incredible to think of those dark ages. Now – so long as you don't mind not "owning" music – you can listen to more or less anything more or less any time. A small Eee PC, hooked up to your sound system, will cost no more than a mid-price tuner. A lifetime of musical exploration beckons.

I make no claims for all of these being the most significant developments in communication over the last decade. They are simply 10 of the inventions and launches that have most affected me. I would find it hard to imagine returning to a life without any of them. Between them they have created the greatest explosion of democracy; access to information; potential for creativity; and the ability to connect and communicate the world has ever seen. They are, each of them, profoundly disruptive and revolutionary, and with consequences that will ripple on into the future. Some (most?) may be transitory, to be replaced by even more transformatory innovations; some more permanent. In just under 10 years they – and millions of developments, technologies and websites like them – have changed the world profoundly.


by Alan Rusbridger


Wednesday, October 21, 2009

A Window Into The 7th Wonder?




Installation

The installation process took over an hour, but once up and running Windows 7’s advantages were impressive. Gathering information from 168,941 user files, settings and “programs” was time-consuming for the automated installer, and so was expanding 2,340MB worth of files. Google’s new Chrome browser installs in seconds, and while that is not an operating system, Microsoft will be aware that everybody’s impatient for their computers to be faster in all respects. However, at least the process was clear about what it was doing, and made it obvious that it hadn’t in fact crashed midway through.

Homegroups and touchscreens

Essentially, this is the operating system Vista should have been – it starts up relatively quickly, drivers already exist to make peripherals such as scanners and printers work with it, and it does clever things that XP, the version of Windows most people still use, just doesn’t.

The two facets that are at the heart of that are based around the “Homegroup”, and new ways of inputting and viewing information. The Homegroup means that you can group together a number of things, from MP3 players to other computers, and so as soon as you join a network, you can instantly see everything you’ve seen before. Music, films and files really are available immediately. This idea is certainly not new, but the breakthrough comes in simplifying the sharing process to accommodate the countless videos, digital pictures and albums that people have taken using digital technology. Windows Media Player, too, now supports far more file formats.

The new ways of dealing with information, meanwhile, mean that touchscreens are much better supported – in the future they will need to be – and that forthcoming applications based around combining television and the internet will slot in neatly, too. The same integration applies to online services such as social networking site Facebook, and, for instance, Windows Live with its shared picture galleries.

The icons here are bigger, too, but they don’t look patronising. The idea is that fat fingers should be as useful as the tiny pointer of a mouse. Shaking individual windows can be used to activate certain features, as well, which could be a natural move with a finger on a touchscreen, but not with the mouse.

Vista’s desktop, improved

On many machines, Vista’s desktop quickly seemed to run slowly. Software updates fixed some issues, but at least three computers I’ve used didn’t seem to respond very effectively. Windows 7 separates its gadgets out from the bar that occupied the right-hand side of Vista’s screen, and consequently feels considerably more friendly. The difference isn’t huge, aesthetically, but functionally it’s significant.

From netbooks to desktops

Another crucial improvement of Windows 7 over Windows Vista is that 7 will run on a range of machines, from low-specification netbooks to high-powered desktops. On our mid-range Studio Hybrid, 7 was certainly more impressive than Vista, although there were bugs that are acceptable in a beta version, especially to do with the anti-virus programs that are currently still very specific to Vista.

Microsoft say that the operating system will be happy on, for instance, an Asus EeePC. Given that the EeePC sometimes struggles to run its cut-down version of Linux, that’s quite some claim, and I’d be eager to see how true it really is.

Intelligent power management

There are substantial improvements in power management, too, with ports being turned off when they’re not in use, and so this is an operating system that does what it should: unlike Vista, it shuts up and keeps things going while you, the user, get on with whatever it is you need to be doing. Batteries will last longer, and the program learns from your behaviour too. If the display is set to dim after 30 seconds and you move the mouse as soon as it does, then you’ll get significantly longer before the computer dims its display again. So there’s time, say, to read something quite lengthy on screen without constant irritation.
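The adaptive dimming behaviour described above can be sketched as a simple back-off rule. This is a hypothetical reconstruction of the idea, not Microsoft's actual heuristic; the function name, doubling factor and cap are all invented for illustration.

```python
def next_dim_timeout(current, user_objected, base=30, factor=2, cap=300):
    # If the user woke the display the moment it dimmed, they were
    # still reading: back off and wait longer next time (up to a cap).
    # Otherwise fall back to the configured base timeout.
    if user_objected:
        return min(current * factor, cap)
    return base


timeout = 30
timeout = next_dim_timeout(timeout, user_objected=True)   # user was reading
print(timeout)  # 60
```

The cap matters in a sketch like this: without it, a few consecutive objections would push the timeout so high that the power saving disappears entirely.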

But should you really go and download a beta?

Senior people at Microsoft say they’ve been using the current build of Windows 7 on their main machines for a few days now and that there have been no problems. Indeed, it is certainly true that the developer community is making surprisingly positive noises about Windows 7, especially when compared to Vista. So people will be taking the plunge in droves, right?

Geeks will, indeed. But does Windows 7 do anything you need so much it’s worth the risk? The answer is almost certainly no.

It offers impressive connectivity, it looks slick to the (Apple?) core, but, for instance, persuading it to ignore warnings about McAfee’s Security Centre being incompatible with this new version of Windows was difficult and annoying. For some unknown reason, too, web access sometimes just ground to a halt and crashed Internet Explorer. No computer or wifi network is perfect, but a consumer should be able to know that such problems are not in their browser, and with a beta you simply don’t have that kind of certainty.

Conclusions

Microsoft has learnt a lot from Vista. The PR for Windows 7 is already better, and the company, more than in the past, is keen to constantly remind users that Friday’s release is only of a test version. It helps, of course, that in truth Vista is now the operating system it should have been when it launched. People aren’t desperate for an upgrade, as they were in the last days of XP.

What Microsoft has realised primarily, though, is that Windows 7 needs to make everything easier – playing music, joining networks, sharing photos should all feel simpler than they do currently. The good news is that with this beta they already do; if Microsoft can really deliver on that vision in the full release, then Windows 7 should be a formidable program indeed.




Wednesday, August 26, 2009

A Very Polished Photo


Sunday, August 02, 2009

Big Question, Wrong Answer?


The green movement's fixation with technology reveals that we are asking the wrong questions


How would you imagine an environmentalist would react when presented with the following proposition? A power company plans to build a new development on a stretch of wild moorland. It will be nearly seven miles long, and consist of 150 structures, each made of steel and mounted on hundreds of tons of concrete. They will be almost 500 feet high, and will be accompanied by 73 miles of road. The development will require the quarrying of 1.5m cubic metres of rock and the cutting out and dumping of up to a million cubic metres of peat.

The answer is that if you are like many modern environmentalists you will support this project without question. You will dismiss anyone who opposes it as a nimby who is probably in the pay of the coal or nuclear lobby, and you will campaign for thousands more like it to be built all over the country.

The project is, of course, a wind farm – or, if we want to be less Orwellian in our terminology, a wind power station. This particular project is planned for Shetland, but there are many like it in the pipeline. The government wants to see 10,000 new turbines across Britain by 2020 (though it is apparently not prepared to support the Vestas wind turbine factory on the Isle of Wight). The climate and energy secretary, Ed Miliband, says there is a need to "grow the market" for industrial wind energy, and to aid this growth he is offering £1bn in new loans to developers and the reworking of the "antiquated" (ie democratic) planning system, to allow local views on such developments to be overridden.

Does this sound very "green" to you? To me it sounds like a society fixated on growth and material progress going about its destructive business in much the same way as ever, only without the carbon. It sounds like a society whose answer to everything is more and bigger technology; a society so cut off from nature that it believes industrialising a mountain is a "sustainable" thing to do.

It also sounds like an environmental movement in danger of losing its way. The support for industrial wind developments in wild places seems to me a symbol of a lack of connectedness to an actual, physical environment. A development like that of Shetland is not an example of sustainable energy: it is the next phase in the endless human advance upon the non-human world – the very thing that the environmental movement came into being to resist.

Campaigners in Cumbria are fighting a proposed wind development near the mountain known as Saddleback, a great, brown hulk of a peak which Wordsworth preferred to call by its Celtic name, Blencathra. Wordsworth thought the wild uplands a place of epiphany. Other early environmentalists, from Thoreau to Emerson, knew too of the power of mountain and moor to provide a clear-eyed and humbling view of humanity.

Many of today's environmentalists will scoff if you speak to them of such things. Their concerns are couched in the language of business and technology – gigawatt hours, parts per million of carbon, peer-reviewed papers and "sustainable development". The green movement has become fixated on a single activity: reducing carbon emissions. It's understandable: what the science tells us about the coming impacts of climate change is terrifying. But if climate change poses a huge question, we are responding with the wrong answers.

The question we should be asking is what kind of society we should live in. The question we are actually asking is how we can power this one without producing carbon. This is not to say that renewable energy technologies are bad. We need to stop burning fossil fuels fast, and wind power can make a contribution if the turbines are sensitively sited and on an appropriate scale.

But the challenge posed by climate change is not really about technology. It is not even about carbon. It is about a society that has systematically hewed its inhabitants away from the natural world, and turned that world into a resource. It is about a society that imagines it operates in a bubble; that it can keep growing in a finite world, forever.

When we clamour for more wind-power stations in the wilderness, we perhaps think we are helping to slow this machine, but we are actually helping to power it. We are still promoting, perhaps unintentionally, the familiar mantras of industrial civilisation: growth can continue forever; technological gigantism will save us; our lives can go on much as they always have.

In the end, climate change presents us with a simple question: are we going to live within our means, or are we, like so many civilisations before us, going to collapse? In that question lies a radical challenge to the direction and mythologies of industrial society. All the technology in the world will not answer it.