Collateral Damage: Landing Credit

Tucker W
2 min read · Apr 16, 2021

In Weapons of Math Destruction, written by Cathy O’Neil, chapter 8 talks about how algorithms affect people’s livelihoods through FICO-style credit scores. These scores help decide the kinds of loans a person is allowed to get. In the beginning the scoring was genuinely helpful and allowed all kinds of people to be granted loans. Later, however, the algorithms came to discriminate in who received a loan and who didn’t. They did not do this through race directly; instead they did it through zip codes. If someone lived in a poorer area, their score would be dragged down by the people who shared their postal code. Since residents of poorer areas are, on average, less able to pay off their debts, the individual receives a lower score and finds it harder to improve it.
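To make the zip-code proxy concrete, here is a minimal sketch of how such a scorer could work. This is a hypothetical illustration, not the real FICO formula: the neighborhood default rates and the 70/30 blend are made-up assumptions, chosen only to show how mixing in a geographic average penalizes an individual for where they live.

```python
# Made-up neighborhood default rates (assumptions for illustration only).
ZIP_DEFAULT_RATE = {
    "10001": 0.05,  # wealthier area, few defaults
    "60628": 0.30,  # poorer area, many defaults
}

def credit_score(on_time_ratio: float, zip_code: str) -> int:
    """Return a 300-850 score that blends personal payment history
    with the applicant's neighborhood default rate (hypothetical)."""
    personal = on_time_ratio                       # 0.0 to 1.0
    neighborhood = 1.0 - ZIP_DEFAULT_RATE[zip_code]
    blended = 0.7 * personal + 0.3 * neighborhood  # arbitrary weights
    return round(300 + blended * 550)

# Two applicants with identical, perfect payment histories:
rich_zip = credit_score(1.0, "10001")
poor_zip = credit_score(1.0, "60628")
print(rich_zip, poor_zip)  # the applicant from the poorer zip scores lower
```

Even with the same flawless record, the applicant from the poorer zip code comes out behind, which is exactly the kind of proxy discrimination the chapter describes.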

Having a low credit score affects many areas of someone’s life. It prevents them from getting a loan for a car or a house. In extreme cases it can even be applied to a job application: a low credit score is sometimes read as a sign of untrustworthiness in an employer’s eyes, which, in turn, leads to the individual not getting the job.

The only way people can prevent cases like these from happening to themselves is to be extremely financially responsible and never miss a payment. They can also request their credit reports to find and fix any identity errors, like being falsely tied to a crime that another person with the same name committed.

My biggest takeaway from this chapter is that, once again, algorithms tend to unintentionally hurt many people’s livelihoods even though they were made with the intent to serve people indiscriminately. Deeply integrated into society though they are, these algorithms should be pulled out and retuned to end their discriminatory tendencies. Once fixed, they can be reintroduced.
