when computing gets a little too personal…

DARPA wants a program that can monitor your actions and predict your every move. Is that even remotely plausible?

In what will sound like an old Yakov Smirnoff joke, DARPA wants your computer to watch you. Literally. Every move you make and everything you do with data will be monitored, recorded, then analyzed and dissected so the computer can use your behavior for authentication. Don’t feel quite like yourself one day? You can’t use the machine, since your irregular behavior locks you out of the system. Pretty nifty, right? Just one question though: how exactly is this going to work with any degree of accuracy? Humans are, by their nature, hard to predict, and recording their typical habits in no way leads to an individualized, secure authentication model. Different people can have very similar habits, and any system based on a basic statistical analysis of human patterns has to allow for a certain degree of variation; otherwise, a curt reply on a rough day could trigger an account lockout. This idea seems to follow the new DARPA pattern of collecting an enormous amount of data, then using it to predict complex soft metrics, something we really can’t do.

My guess is that the different factors like eye scans and commonly used words will be used as inputs to an artificial neural network. Then, when a person whose habits were monitored interacts with the system, all the interactions will be fed in and run through the network on an ongoing basis to determine whether they fit the established patterns. What happens if you’re just having an off day? Trouble at home? Maybe you got a ticket on your way to work? Maybe the traffic was particularly horrendous that day and you’re still seething over some random twit who decided to cut through two lanes of freeway traffic doing 25 under the speed limit right under your nose, almost making you rear-end him? Well, the neural network probably won’t like that and will interfere as you try to work, threatening to turn your simmering anger into full blown fury and triggering another system that observes your outward behavior to target you as a security risk. Watching what you tend to type, how much you type, where your eyes are directed, and so on and so forth, isn’t a very good indicator of how you’re going to behave in the future, and it doesn’t capture anything all that individual, especially if you’re doing fairly repetitive tasks on a computer as part of your daily routine. It’s just data for a massive data dump.
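To make the idea concrete, here’s a deliberately crude sketch of behavioral scoring. All the feature names, baseline numbers, and the threshold are hypothetical illustrations, not anything from DARPA’s actual design, and a real system would use a trained network rather than simple z-scores. The point it demonstrates is the one above: an off day looks statistically identical to an intruder.

```python
# Hypothetical baseline for one user, supposedly learned during a
# monitoring period: (mean, standard deviation) for each behavior.
BASELINE = {
    "typing_speed": (62.0, 8.0),     # words per minute
    "session_length": (45.0, 15.0),  # minutes
}

def anomaly_score(sample):
    """Average absolute z-score of the sample across all features."""
    total = 0.0
    for feature, (mean, std) in BASELINE.items():
        total += abs(sample[feature] - mean) / std
    return total / len(BASELINE)

def is_same_user(sample, threshold=2.0):
    """Accept the session if behavior stays within the threshold."""
    return anomaly_score(sample) <= threshold

# An ordinary day passes the check...
print(is_same_user({"typing_speed": 60.0, "session_length": 50.0}))  # True
# ...but a distracted, angry day gets flagged as someone else.
print(is_same_user({"typing_speed": 25.0, "session_length": 10.0}))  # False
```

Loosen the threshold and strangers with similar habits get in; tighten it and the legitimate user on a rough morning gets locked out. That trade-off is the crux of the problem.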

And this is not to mention that judging your every interaction with a computer on even an hour-by-hour basis is going to take an immense amount of computing power, since your actions have to be recorded and sent to the equivalent of a small server farm to be constantly run through the neural network. Since we’re talking about the mad science arm of the military, this system’s target user would be the Defense Department, which has millions of uniformed service members, employees, and contractors to keep track of. The system would have to analyze billions of actions every day, nonstop. It’s not impossible, but it would be very expensive to maintain and, as we’ve just discussed, the results it would provide are dubious at best. It may be tempting to see the patterns of data it will generate as extremely informative and revealing, but they’re not. Since it has to deal with humans, anything other than a major anomaly within the entire system will get lost in the noise, and seemingly personally identifiable computer usage habits will be homogenized into something so generic, it’s going to apply to an entire subset of computer users rather than just one. Any other approach and the network could be constantly going off with false alarms, with thousands of people locked out on a regular basis after an occasional sneeze sets off the system’s hair trigger, which DARPA would find unacceptable.
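The scale of the false-alarm problem is easy to put in back-of-the-envelope terms. The user count, check frequency, and error rate below are illustrative assumptions, not actual DoD figures:

```python
# Illustrative assumptions, not actual DoD numbers.
users = 3_000_000            # service members, employees, contractors
checks_per_day = 24          # one behavioral check per hour
false_positive_rate = 0.001  # a seemingly tight 0.1% error per check

checks = users * checks_per_day
false_alarms_per_day = checks * false_positive_rate

print(f"{checks:,} checks/day -> {false_alarms_per_day:,.0f} false lockouts/day")
# 72,000,000 checks/day -> 72,000 false lockouts/day
```

Even an error rate that sounds impressively small locks out tens of thousands of legitimate users every single day, which is exactly why the thresholds would have to be loosened until the “personal” signature turns generic.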

We need to remember that with today’s computers, we can easily record enormous amounts of data and then crunch through it faster than an entire army of humans. But that doesn’t mean the data we collect has to yield some profound insight we can tease out for predictive purposes. We have to focus not on what we can measure, but on why we’re measuring it and what factors are involved. We can mine oceans of data for some big and surprising factoids, like say, 88% of users don’t use a feature the site owners thought would be huge. And that’s really it. From this data, we can’t predict that tweaking the feature in certain ways will triple its usage, because personal preference varies greatly, and something new, completely outside the reach of your data collection, may be what really captures your users’ time and attention. And sure, we can use certain data to help solve very straightforward prognostication problems, but only when they involve few factors, are very narrowly defined, and are based on solid data points we can express as hard facts such as numbers, text, or true/false values. Beyond that, we’re engaging in what is really more or less just informed speculation that could be spot on, or a textbook example of GIGO.
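The line between an answerable question and speculation can be drawn in a few lines of code. The usage log below is made up for illustration:

```python
# Made-up usage log: out of 1,000 users, how many touched the new feature?
used_feature = 120
total_users = 1000

# A narrow, hard-data question the log can actually answer:
usage_rate = used_feature / total_users
print(f"{usage_rate:.0%} of users tried the feature")  # 12% of users tried the feature

# A question the same log cannot answer: will a tweak triple that number?
# Nothing here records preferences, alternatives, or future behavior,
# so any answer derived from this data would be speculation, not prediction.
```

The first computation is a solid factoid; the second question asks the data to contain information it was never able to capture, which is the GIGO trap in miniature.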

# tech // cybersecurity / data mining / security
