If you live in the U.S. and still watch certain TV channels, you may be forgiven for thinking that if you don’t know your FICO score, or lack apps and services to notify you of every slight change within moments, you may as well give up on owning or renting anything without a massive pile of cash sitting in a bank. Cutting through the commercial hyperbole, there’s a bit of truth to that in a country where borrowing is high and saving is low. Lenders need an objective and quick way to figure out how likely you are to repay them, and one company, Fair Isaac, has long claimed it owns an equation to predict exactly that based on your history of making timely payments and other factors that seem important. The end result is quick: a three-digit number that seems to speak volumes. But is it objective in an age where a layoff driven by automation or outsourcing, or a dire medical problem, can instantly land you in a world of financial pain and ruin? Probably not. No matter how you look at it, the FICO score has some pretty significant shortcomings, but fixing them could actually get really, really ugly…
For a few years, credit rating agencies have been toying with the idea of using social media as an additional barometer of your creditworthiness, particularly Facebook and LinkedIn, trying to find a correlation between your online contacts and your odds of a default. In some cases, you can make fairly accurate predictions. A senior manager at a very large corporation, whose contacts on professional social networks are all high-powered business people and whose resume is full of big numbers and grand accomplishments, is probably not going to stop paying for his new BMW or buy a new house and skip town. But what about a hardworking college student with a couple of stoner friends who never amounted to much still listed in her Facebook contacts? You may as well flip a coin, because if you’re deciding the worth of a person only by the company he or she keeps, not only does it open the door to discrimination, it also strips that applicant of agency by holding friends’ failures, real or imagined, over this person’s head. Yes, this student may default and fall behind. But she could also be determined to build up a great credit score no matter the personal cost and pay in full, on time, every time, while working her way to adulthood.
Now, as scary as the attempts to base your credit rating on that of your friends sound, they’ve got nothing on China’s grand plan to develop a social score for its citizens, one that goes far beyond the humble creditworthiness rating and all the way into meddling in their personal lives and political beliefs. Not only do you need a great history of on-time payments to qualify for loans or ownership of private property, but you must also demonstrate that you’re a productive citizen who is loyal to the party. Buying video games penalizes you while buying diapers rewards you. Your friends started posting sarcastic, Soviet-style jokes about the Communist Party? Well, you really didn’t want to buy a new house or get a new car, did you now? Oh, you did? Too bad. Probably shouldn’t be friends with unpatriotic dissidents then. You can see where this is going. Imagine a similar score in the U.S. used by the NSA and FBI to assign one’s likelihood of becoming some sort of criminal or terrorist, with their less-than-airtight statistical models used to justify searches and seizures of random individuals whose personal choices and behavior matter less and less than the choices and behaviors of their social group. It’s like a dystopian sci-fi tale coming to life.
Really, there’s a limit to how much data we should be collecting and using, and we should allow people to opt out of collection processes they think can be abused. Maybe a credit rating agency does want to create a financial product for people who want to use their friends to vouch for them. It would be their choice to see how it pans out. But if it’s using the same kind of research on applicants for new lines of credit who have not consented to this process, it needs to be punished heavily, so that violating the rules costs much more than simply complying with them. Just because we are fully capable of quickly and easily creating the tools for an Orwellian society doesn’t mean that we have to enable tyranny by algorithm and pretend that because computers are making decisions based on data they’re collecting, it’s all objective and above board. People program all of these sites, people collect and organize this information, and people write the algorithms that will crunch it and render a verdict. And people are often biased and hypocritically judgmental. If we let their biases hide in lines of code watching our every move and encouraging us to be little model citizens, like the Chinese plan does, the consequences will be extremely dire.