taking a peek into the minds of evildoers
WeCU Technologies thinks it can read the minds of would-be terrorists, and it wants to test its approach at airports. Unfortunately, that approach is unlikely to work.
A company called WeCU Technologies wants to read what’s on your mind. Literally. But don’t worry, they’re not interested in your opinions on political hot-button topics or your personal information. They want to know whether you might be a terrorist by flashing symbols and code words associated with terrorist groups and watching for a reaction. Should you show any sign of anxiety, like averting your eyes, a quick elevation of your heart rate, or a few other subtle cues that you might be nervous, you’ll be pulled aside for additional screening.
Don’t you love it when popular science is extrapolated into a cutting-edge security and defense project, especially a potentially serious undertaking in hacking the brains of would-be evildoers and stopping them before they can carry out their sinister plots of terror and mayhem? The big question, though, is whether it will actually work.
The short answer? No. While it’s true that certain areas of the brain light up when people recognize a symbol, knowing that the symbol was understood as familiar would require a full-blown active MRI scan, something a security line at the airport isn’t going to have anytime in the foreseeable future. The other problem is that even an unrealistically advanced brain imaging method that could tell whether a symbol was recognized still couldn’t tell why it was recognized or how it was processed. It’s not like the brain has a “went to a terrorist camp and was told to look for this symbol to carry out a bombing” center.
Today, anyone surfing the web could be exposed to symbols and codes used by terrorist groups. So why would they be worried when one flashes on a monitor at a security checkpoint? And there are myriad reasons someone could be nervous going through an airport that have nothing to do with terrorism: a bad day at the office, spotting an ex in the line, getting terrible news just as you’re ready to board your plane, or worrying about a relative who’s on the verge of death, which is why you’re flying in the first place.
We’re talking about a system that could generate so many false positives, you would get the same results by randomly screaming about terrorists into a crowd of travelers and picking out whoever looks alarmed for further interrogation. In fact, that’s exactly what flashing terrorist codes at checkpoints amounts to.
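To see just how badly the false positives would swamp any real signal, here’s a quick Bayes’ theorem sketch. All the rates are illustrative assumptions, not WeCU’s actual figures, and they’re deliberately generous to the system:

```python
# Base-rate sketch: why anxiety screening drowns in false positives.
# Every number below is an illustrative assumption, not a measured figure.

def posterior_threat_probability(prevalence, sensitivity, false_positive_rate):
    """Bayes' theorem: P(actual threat | flagged by the screener)."""
    true_alarms = sensitivity * prevalence
    false_alarms = false_positive_rate * (1 - prevalence)
    return true_alarms / (true_alarms + false_alarms)

prevalence = 1e-6          # assume 1 actual terrorist per million travelers
sensitivity = 0.9          # generously assume the system catches 90% of them
false_positive_rate = 0.1  # assume only 10% of nervous innocents get flagged

p = posterior_threat_probability(prevalence, sensitivity, false_positive_rate)
print(f"P(threat | flagged) = {p:.6%}")  # about 9 in a million flagged people
print(f"False alarms per million travelers: {false_positive_rate * 1e6:,.0f}")
```

Even with those charitable assumptions, roughly 100,000 innocent travelers per million get pulled aside, and the odds that any given flagged person is an actual threat stay around nine in a million.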
Oh, and by doing that, you would probably just prompt real terrorists to train themselves not to show any visible emotion that could be interpreted as anxiety. Then you’ll end up with nothing more than a subjective opinion about who looks dangerous instead of a good way to tell who might be up to something. Oh, and you can sell it for tens of thousands of dollars, touting its promise in preventing security breaches and terrorist attacks. That may very well be the point for WeCU. Let’s just hope the TSA doesn’t fall for the sales pitch and knows to make sure it’s not being sold a line of pop psych with a hefty dollop of exaggeration…