My friend Simon is one of those net entrepreneurs with the attention to detail it takes to have an idea and turn it into an effective company. He’s currently running his second job search service, and it seems to be going very well.
One reason for the success may be that Simon has embraced the network age with a dedication that most of us can only wonder at. He uses a range of productivity tools, scheduling services and collaborative systems to manage both his personal and professional life, and once confessed to me that he had ‘outsourced his memory’ to Microsoft Outlook and its calendar service.
So far I’ve resisted the temptation to pay a team of hackers to break into his laptop and add ‘jump off a cliff’ as his 10am appointment on Thursday.
Recently I’ve noticed that Simon’s head is in the cloud. Or rather, his business is, as he and his team have moved most of their systems online, taking advantage of the move from local storage and processing to ‘cloud computing’, where data and services are provided online and accessed from a PC or any other device.
For a small but growing business it means that new storage and processing capacity can be added incrementally instead of having to buy a whole new server at a time.
And for a distributed company like WorkCircle, where the team all work from their own homes or offices, it makes coordination, document sharing and collaboration a lot easier.
The approach is growing in popularity, and Google, Microsoft and Amazon are among the many large companies working on ways to attract users to their offerings, with Google Apps, Microsoft’s Live Mesh and Amazon S3 all signing up customers as they try to figure out what works and what can turn a profit.
The technical obstacles to making distributed systems work are formidable, and while books like Nick Carr’s ‘The Big Switch’ talk optimistically about the potential for utility computing to be offered to homes and businesses just like electric power, building robust, reliable and scalable systems around these new models will tax our ingenuity.
As we become more reliant on the cloud, any problems will become more severe, as we can see in the irritation many users feel with Twitter at the moment: constant outages, dropped messages and general flakiness as the company tries to cope with what was clearly unanticipated growth in usage.
It would be a lot worse if your spreadsheets or presentations were inaccessible because of problems in the cloud, or rather because of problems with the physical computers or network connections that make cloud computing possible.
Because behind all the rhetoric and promotional guff the ‘cloud’ is no such thing: every piece of data is stored on a physical hard drive or in solid state memory, every instruction is processed by a physical computer and every network interaction connects two locations in the real world.
It is often useful to conceptualise online activities as ‘cyberspace’, the place behind the screen, but the internet is firmly of the real world, and that is one of the greatest problems facing cloud computing today.
In the real world national borders, commercial rivalries and political imperatives all come into play, turning the cloud into a miasma as heavy with menace as the fog over the Grimpen Mire that concealed the Hound of the Baskervilles in Arthur Conan Doyle’s story.
The issue was recently highlighted by reports that the Canadian government has a policy of not allowing public sector IT projects to use US-based hosting services because of concerns over data protection.
Under the USA PATRIOT Act the FBI and other agencies can demand to see content stored on any computer, even if it is being hosted on behalf of another sovereign state. If your data hosting company gets a National Security Letter then not only do they have to hand over the information, they are forbidden from telling you or anyone else (apart from their lawyer) about it.
The Canadians are rather concerned about this, and rightly so. According to the US-based Electronic Frontier Foundation, a civil liberties group that helped the Internet Archive successfully challenge an NSL, over 200,000 such letters were issued between 2003 and 2006, and the chances are that Google, Microsoft and Amazon were on the recipient list.
Even encrypting the data stored in data centres won’t always work, as one of the benefits of Amazon’s S3 and other services is that they do remote processing too, and the data needs to be decrypted before that can happen.
This is not just a US issue, of course, although attention has focused on the US because that is where most of the ‘cloud’ data centres can be found. It applies just as much to the UK, where the Regulation of Investigatory Powers Act allows the police or secret services to demand access to databases and servers. And other countries may lack even the thin veneer of democratic oversight that the USA and UK offer to the surveillance activities of their intelligence agencies.
Companies have no real choice but to comply with the law in countries where they operate, and I don’t expect a campaign of civil disobedience from the big hosting providers. Those of us who use the cloud just need to be clear about the realities of the situation—and not send or store anything on GoogleMail or HotMail that the US government might want to use against us.
Part of the attraction of the internet was always that it transcended geographic boundaries of all forms, whether political or physical. Communities grew because people shared interests or values, not because they lived in the same place or were under the same government. It was far from perfect, but it gave us a glimpse of a better world.
The push towards cloud computing may force us to be more realistic about the boundaries that have always existed. Perhaps it is time for the UN to consider a ‘cyberspace rights treaty’ that will outline what it’s acceptable to do when other people’s data comes into your jurisdiction.