One of the most frequently invoked caricatures of computer illiteracy involves an enraged senior citizen demanding that something he finds offensive or objectionable be deleted from the internet, because we all know that once something is out on the web, it’s out there until there are no more humans left anywhere. This is actually kind of cool. We’re a civilization leaving a detailed, minute-by-minute account of who we are, what we did, and how we did it, mistakes and flaws included, in real time, and barring some calamity, hundreds of years from now there could well be a real web archaeologist looking at your Facebook profile as part of a study. But that’s also kind of scary to EU bureaucrats, so they’re arguing for a kind of right to be forgotten for the web, a delete-by date for every piece of content out there. This way, if you say or do something stupid when you’re young, it won’t come back to bite you in your future career or social interactions. It seems like a good, and very helpful, idea. Too bad it’s pretty much technically impossible.

Sure, you or someone else could delete a certain file on cue from a server. But the web isn’t run on just one server, and all major sites nowadays run in a cloud, which means that their data leads a nomadic life and has been replicated hundreds if not thousands of times over, not only for caching and backups, but also for the purposes of anycasting. Without anycasting, getting your data from the cloud could be a miserable experience because if you’re in LA and the server that hosts your data is in, say, Sydney, there’s going to be a lot of latency as it travels through an underwater fiber pipe thousands of miles long. But if the closest data center is in Palo Alto, there will be a lot less territory for the data to cover and you’ll get it much faster. The catch is that the same compromising picture, or post, or e-mail is living in both data centers. And on their backups. And in their caches. Oh, and on all the other "edge servers" in all the other data centers used by the website’s cloud, directly or through third party arrangements.
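
To make that concrete, here’s a minimal sketch of latency-based replica selection, with made-up region names, latency figures, and file names rather than any real CDN’s routing logic. The point is simply that the same object has to live in every region for the scheme to work at all:

```python
# A minimal sketch of latency-based replica selection, with made-up region
# names and latency figures. Every region stores an identical copy of the
# object, and a request is simply served from whichever copy is "closest".

REPLICAS = {
    "sydney":    {"latency_ms": 160, "object": "party_photo.jpg"},
    "palo_alto": {"latency_ms": 12,  "object": "party_photo.jpg"},
    "frankfurt": {"latency_ms": 145, "object": "party_photo.jpg"},
}

def serve(object_name: str) -> str:
    """Return the object from the replica with the lowest measured latency."""
    region = min(REPLICAS, key=lambda r: REPLICAS[r]["latency_ms"])
    assert REPLICAS[region]["object"] == object_name  # every region holds a copy
    return f"served {object_name} from {region}"

print(serve("party_photo.jpg"))  # -> served party_photo.jpg from palo_alto
```

Which is exactly why "just delete it" isn’t a single operation: every one of those copies, plus the caches and backups behind them, would have to be tracked down and purged.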

Additionally, marking each piece of data with a self-destruct feature is very problematic. If data can be marked for deletion, it can just as easily be un-marked, and requiring all data to carry a use-by timestamp will mean a lot of very painful and expensive changes for the databases and the data centers expected to support this functionality. Putting a price tag of a few billion dollars on this sort of rewiring is probably very optimistic, and even then, it’s a near certainty that a hacker could disable the self-destruct mechanism and keep your data forever. Likewise, what if you do want to keep a certain picture or e-mail forever for its sentimental value and lose track of it? Will you still be able to stumble on it years later and relive the precious moment? Yes, embarrassing stuff living on the web for the foreseeable future and beyond is a big deal, but there is a purely non-technical solution to it. Think twice before posting, and understand that everybody has done an embarrassing thing or two hundred in the past, and will continue to do them in the future.
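
For illustration only, here’s what a delete-by date might look like as a toy schema; the field names and purge logic are hypothetical, not anything the EU proposal actually specifies. The last lines show the core weakness: anyone holding a copy can simply strip the marker.

```python
# A toy "delete-by date" scheme with a hypothetical schema: each record
# carries an expires_at timestamp, and a purge job drops anything past it.
from datetime import datetime

records = [
    {"id": 1, "body": "embarrassing post", "expires_at": datetime(2015, 1, 1)},
    {"id": 2, "body": "wedding photo",     "expires_at": None},  # keep forever
]

def purge(rows, now=None):
    """Keep only rows with no expiry, or an expiry still in the future."""
    now = now or datetime.now()
    return [r for r in rows if r["expires_at"] is None or r["expires_at"] > now]

# The core weakness: anyone holding a copy can simply strip the marker.
unmarked = [dict(r, expires_at=None) for r in records]
print(len(purge(records)), len(purge(unmarked)))  # -> 1 2
```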

In five to ten years, we will have been living online for roughly two decades and will have seen generation after generation enmesh itself in social media with mixed results. Barring something far too alarming to ignore, like current, proud, and vocal bigotry, someone’s past missteps shouldn’t be held against them. We’ll eventually forget that the pictures or posts or e-mails are even there, and when we unearth them again, we’ll more often than not be dealing with a totally different person, so we can laugh them off as old mistakes not worth rehashing, because that’s exactly what they are. The current legal gnashing of teeth about the eternal life of digital information is coming up because this is all new to the middle-aged lawyers and senior judges who are used to being able to hide and forget their youthful indiscretions, and to not finding anything of potential shock value about someone’s past unless they dig for it on purpose. Generations used to a life lived in public are almost bound to have a very different, much more forgiving view.

Good stories need conflict, and if you’re going to have conflict, you need a villain. But you don’t always get the right villain in the process, as we can see with the NYT’s scathing article on waste in the giant data centers that form the backbone of cloud computing. According to the article, data centers waste between 88% and 94% of all the electricity they consume on idle servers. When they’re going through enough electricity to power a medium-sized town, that adds up to a lot of wasted energy, and diesel backups generate quite a bit of pollution on top of that. Much of the article focuses on portraying data centers as lumbering, risk-averse giants who either refuse to innovate out of sheer fear or have no incentive to reduce their wasteful habits. The real issue, the fact that their end users demand 99.999% uptime and will tear their heads off if their servers are down for any reason at any time, especially during a random traffic surge, is glossed over in just a few brief paragraphs despite being the key to why data centers are so overbuilt.

Here’s a practical example. This blog is hosted by MediaTemple and has recently been using a cloud service to improve performance. Over the last few years, it’s been down five or six times, primarily because database servers went offline or crashed. During those five or six outages, this blog was unreachable by readers and its feed was present only in the cache of the syndication company, a cache that refreshes fairly frequently. This means fewer views because, for all intents and purposes, the links leading to Weird Things are dead. Fewer views mean a smaller payout at the end of the month, and when this was a chunk of the income I needed to pay the bills, the hit was unpleasant to take. Imagine what would’ve happened if, right as my latest post got serious momentum on news aggregator sites (I once had a post make the front pages of both Reddit and StumbleUpon and pull in 25,000 views in two hours), the site went down due to another server error. A major and lucrative spike would’ve been dead in its tracks.

Now, keep in mind that Weird Things is a small site doing between 40,000 and 60,000 or so views per month. What about a site that gets 3 million hits a month? Or 30 million? Or how about the massive news aggregators dealing with hundreds of millions of views in the same time frame, for which being down for an hour means tens of thousands of dollars in lost revenue? Data centers are supposed to be the Atlases holding up the world of on-demand internet in a broadband era, and if they can’t handle the load, they’ll be dead in the water. So what if they waste 90% of all the energy they consume? The clients are happy and the income stream continues. They’ll win no awards for turning off a server, then taking a minute or two to boot it back up and start all the instances of the applications it needs to run. Of course, each instance takes only a small amount of memory and processing capability even on a heavily used server, so there’s always the viable option of virtualizing many servers on a single box to utilize more of its hardware.
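
As a rough, back-of-the-envelope sketch with entirely made-up resource figures, here’s the consolidation math that makes virtualization attractive: many lightly loaded instances can share one physical host instead of each idling on its own box.

```python
# Back-of-the-envelope consolidation math with entirely made-up figures:
# how many lightly loaded application instances fit on one physical host
# before CPU or memory becomes the limiting factor.
HOST = {"cores": 32, "ram_gb": 256}
INSTANCE = {"cores": 0.5, "ram_gb": 2}  # a mostly idle app instance

def instances_per_host(host, inst):
    by_cpu = host["cores"] / inst["cores"]
    by_ram = host["ram_gb"] / inst["ram_gb"]
    return int(min(by_cpu, by_ram))

print(instances_per_host(HOST, INSTANCE))  # -> 64 instances on one box
```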

If you were to go by the NYT article, you’d think that data centers are avoiding this, but they’re actually trying to virtualize more and more servers. The problem is that virtualization on a scale like this isn’t an easy thing to implement, and there are a number of technical issues that any data center will need to address before going into it full tilt. Considering that each center runs on what a professor of mine used to call its "secret sauce," it will need to make sure that any extensive virtualization scheme it wants to deploy won’t interfere with that recipe. When we talk about changing how thousands of servers work, we have to accept that it takes a while for a major update like that to be tested and deployed. Is there an element of fear there? Yes. But do you really expect there not to be any when the standards to which these data centers are held are so high? A 99.999% uptime guarantee allows for barely five minutes of total downtime in an entire year (it’s 99.9% that buys you the oft-quoted 8 hours and 45 minutes), and a small glitch here or there can easily put a data center in breach of its service contract. So while they virtualize, they’re keeping their eye on the money.
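
The arithmetic behind those uptime figures is worth spelling out, since the gap between three nines and five nines is enormous; this is just the standard availability calculation, not anything specific to a particular contract:

```python
# The standard availability arithmetic: the yearly downtime budget allowed
# by a given uptime percentage.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_budget_minutes(availability_pct: float) -> float:
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

print(downtime_budget_minutes(99.9))    # ~525.6 minutes, roughly 8 hours 45 min
print(downtime_budget_minutes(99.999))  # ~5.3 minutes in an entire year
```

Three nines buys a data center almost nine hours of slack a year; five nines leaves barely five minutes, which is why nobody wants to be the engineer whose clever power-saving scheme ate the whole budget.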

But the silver lining here is that once virtualization in data centers becomes the norm, we will be set for a very long time in terms of data infrastructure. Very few, if any, additional major data centers will need to be built, and users can continue to send huge files across the web at will just as they do today. If you want to blame anyone for the energy waste in data centers, you have to point the finger squarely at consumers with extremely high demands. They’re the ones for whom these centers are built, and they’re the ones who would bankrupt a data center should an outage major enough to affect their end-of-month metrics happen. This, by the way, includes us, the typical internet users, as well. Our e-mails, documents, videos, IM transcripts, and the backups we keep in case our computers break or get stolen all have to be housed somewhere, and these supposedly wasteful data centers are where they end up. After all, the cloud really is just huge clusters of hard drives filled to the brim with stuff we may well have forgotten by now, alongside the e-mails we read last night and the Facebook posts we made last week…

If we had all the things science fiction movies promised us we’d have by now, we’d be zooming to a four-hour work day in flying cars, taking routine vacations to the Moon and Mars, installing artificial organs on a whim to rebuild our bodies into superhuman forms, and enjoying constant, on-demand access to a suite of tools that lets us do anything from booking those flights to downloading entire libraries’ worth of books. Ok, so one out of four is not that bad, and we’re certainly enjoying the digital cloud thanks to a new generation of phones which are always connected to the web and can tell us where we are, where we need to go, and whether we have e-mails we really need to read when we get a moment. But there has to be a way to get rid of that bulky phone thing we’re carrying around to do all that, something a little more futuristic that keeps us in touch with everything that goes on without us having to stop and check a device. Google certainly thinks so, which is why it’s working on a little augmented reality project called Google Glass, which will basically put your smartphone into a pair of glasses and will probably make you look like a hipster until all the required technology really shrinks down…

So, in the words of someone who asked me about the feasibility of such a project, can this be a thing? Yes, it definitely can, and it looks awesome as a concept, straight out of a Kurzweilian cyber-utopian fantasy. We have all of the technology to make it happen, so all we need to do is put it together and make it look good enough to buy and easy enough to use. Unfortunately, I have trouble believing it would work nearly as smoothly as it does in the video without an interface with your brain. Since you probably don’t want to be wearing a band of electrodes around your head on a daily basis or have a microchip implanted into your skull, you’d have to use your eyes or voice commands to control it, which will have the frustrating tendency to make the device do a lot of things it shouldn’t be doing at the moment. For example, if the movement of your eyes or blinking will scroll through your options or select a prompt, what happens when you blink simply because that’s what your body does, or look in the direction of a prompt to check out something that catches your interest? And voice commands on busy streets may not be the best for usability. The algorithm that parses your command to figure out what you’re saying will have to struggle against a lot of background noise, and consumer voice recognition systems aren’t really all that great in the first place, which makes them a challenge to use even when it’s quiet.
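
One common workaround for that accidental-activation problem, sometimes called the Midas touch problem in eye-tracking research, is dwell-time selection: a prompt only fires after the gaze rests on it for a while, and short gaps are treated as blinks. Here’s a minimal sketch with made-up thresholds and a made-up sample format, not Google’s actual approach:

```python
# A minimal sketch of dwell-time selection for a gaze-driven UI, with
# made-up thresholds. A prompt only fires after the eye rests on it long
# enough, and short gaps are treated as blinks rather than cancellations.
DWELL_SECONDS = 0.8   # gaze must stay on a target this long to select it
MAX_BLINK_GAP = 0.3   # gaps shorter than this don't reset the dwell timer

def select_target(samples):
    """samples: list of (timestamp, target_or_None) gaze readings.
    Returns the first target dwelled on for DWELL_SECONDS, else None."""
    current, start, last_seen = None, None, None
    for t, target in samples:
        if current is not None and target == current:
            last_seen = t
            if t - start >= DWELL_SECONDS:
                return current
        elif (current is not None and target is None
              and t - last_seen <= MAX_BLINK_GAP):
            continue  # looks like a blink: keep the dwell timer running
        else:
            current, start, last_seen = target, t, t
    return None

gaze = [(0.0, "weather"), (0.2, None), (0.4, "weather"), (0.9, "weather")]
print(select_target(gaze))  # -> "weather"; the blink at 0.2s didn't cancel it
```

Even a simple scheme like this trades one frustration for another: it filters out stray blinks, but it also makes every deliberate selection feel just a little bit slower.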

When Siri was added to iPhones, I could hear growls of frustration coming from iPhone users in my office as they tried to get their devices to recognize fairly simple commands, with cries like "weather, weather you idiot!" and "no, not Denver! How the hell did you get Denver?" True, it will get better with time, but it’s going to be a very long road to get it to work as well as the human brain at recognizing natural speech, and while that’s happening, you probably don’t want to be walking down the street sounding like you’re in an argument with yourself, especially without the obnoxious little Bluetooth earbud that makes people wonder whether you’re on a call with your boss or just talking to the voices in your head. The fellow in the video seems to be rather happily chatting with his glasses, but consider how he would look to others when suddenly asking where his friend is to no one in particular. Wouldn’t it be rather socially awkward to be in a coffee shop in which people talk into a void while drinking their coffee? It’s already awkward when they’re chatting away on their phones and our still-evolving brains try to make out the other end of the conversation before simply ignoring it. There’s also the creepy factor of seeing people ten years into the future interfacing with their glasses via some sort of nanobot structure embedded into their white matter, sitting in silence, staring seemingly into space, but really working on a report for work or browsing e-mails from their friends. And yes, that’s also very technically plausible…
