It’s common knowledge these days that every time any one of us participates in social media ("likes," "retweets," or "posts") and whenever we buy something online, log into an app, or view an ad on our phones or computers, that information is collected, stored, and used. The music you download gives a strong clue about your age. Living in certain census tracts hints at your racial or ethnic heritage.
The collective data about you and everyone else is processed using algorithms, a fancy term for computer programs that look for patterns in the data. A simple example: if you buy a few romance novels at Amazon, Amazon will recommend other romance novels. The algorithm draws not only on your reading preferences, but also on the best-selling books within those preferences, and perhaps the type of romance novels you’ve purchased.
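The pattern-matching behind "customers who bought this also bought that" can be surprisingly simple. Here is a toy sketch of that idea, assuming a co-purchase approach; the shoppers, titles, and purchase histories are all invented for illustration, and a real recommender would be far more sophisticated.

```python
# Toy "customers who bought X also bought Y" recommender.
# All names and purchase histories below are made up.
from collections import Counter

purchases = {
    "alice": {"Pride and Passion", "Duke of Hearts", "Sea of Stars"},
    "bob": {"Pride and Passion", "Duke of Hearts", "Highland Vows"},
    "carol": {"Duke of Hearts", "Highland Vows"},
}

def recommend(user, purchases, top_n=2):
    """Suggest the books most popular among shoppers who share a purchase."""
    mine = purchases[user]
    counts = Counter()
    for other, books in purchases.items():
        if other != user and mine & books:   # shares at least one book with us
            counts.update(books - mine)      # tally books we don't own yet
    return [title for title, _ in counts.most_common(top_n)]

print(recommend("alice", purchases))  # → ['Highland Vows']
```

No one programmed a rule saying "romance readers like Highland romances"; the suggestion falls out of overlapping purchase histories, which is the whole trick.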
A more complex example: click on an article about almost anything, and suddenly advertisements about that topic appear on your browser pages and follow you around. When Cambridge Analytica obtained Facebook user data, the firm was able to determine not just voter preferences, but also the issues that would raise or lower a voter’s enthusiasm for a particular candidate, and then target advertisements at individuals designed to shift those enthusiasm levels.
The point here is that algorithms play a huge role not only in helping us make decisions, but in influencing those decisions—and some people are alarmed about the potential for misuse. They have good reason. Facebook famously blocked many Native Americans from signing up for accounts because the software thought their names—including Lance Browneyes and Dana Lone Hill—were fake. An Amazon artificial intelligence algorithm designed to screen job applicants managed to teach itself to weed out women by looking for certain keywords on their resumes. Researchers recently found that job-search algorithms are less likely to refer women to high-paying positions. Why? Because these job seekers don’t match the typical profile of people already holding those jobs—mostly white men.
Algorithmic lending systems have tended to charge higher interest rates to Latino and African-American borrowers, not because they can detect the color of the borrower’s skin, but because of ethnic purchasing patterns. Those demographic cohorts have been, in the past, less likely to shop online than their white Anglo peers....
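This is the "proxy variable" problem in miniature: a model that never sees race can still reproduce racial disparities if one of its inputs happens to correlate with race. The sketch below is entirely synthetic—the feature, rates, and rule are invented to show the mechanism, not any real lender's model.

```python
# Sketch of proxy discrimination: the pricing rule below never looks
# at race, yet if "shops_online" correlates with ethnicity in the real
# world, the rate disparity tracks ethnicity anyway.
# All applicants, features, and rates here are synthetic.

applicants = [
    {"name": "A", "shops_online": True},
    {"name": "B", "shops_online": False},
]

def quoted_rate(applicant, base=0.05, penalty=0.02):
    """Quote a higher rate to offline shoppers (a hypothetical rule)."""
    return base if applicant["shops_online"] else base + penalty

for a in applicants:
    print(a["name"], quoted_rate(a))  # A → 0.05, B → 0.07
```

Removing the race column from the data, in other words, does not remove race from the model's behavior—the correlation sneaks back in through whatever proxies remain.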