Could something that isn't human reproduce behavior that is exclusively human? More specifically: could an algorithm be sexist?
The Microsoft researcher Tarleton Gillespie, one of the great thinkers in the field, argues that it can: because any algorithm is a human creation, its patterns and selections carry a reflection of the characteristics of human beings themselves, both good and bad. And Apple may be the latest company to illustrate this notion.
Let me explain: in recent days, a controversy has formed around the credit algorithm of the Apple Card. For those unfamiliar with it, Apple's credit card has a nearly automatic signup process in which the user only needs to provide some data and is shown a credit limit almost instantly, determined by Apple and the bank Goldman Sachs. It all started when the developer David Heinemeier Hanson, creator of the Ruby on Rails framework, accused the card's algorithm of being sexist:
Hanson said the problem goes even deeper, noting that, unlike his, his wife's Apple Card does not allow new purchases until the next billing period, even after she has paid the most recent invoice. The developer also criticized the card's customer service, complaining that agents are not allowed to discuss the credit-evaluation process and cannot provide any justification for the decision.
Hanson and his wife even paid $25 to check both of their credit scores in the American banking system, and to the surprise of the couple (and of Apple, which was in contact with them), her score was higher than her husband's. In other words, there would be no plausible explanation for the discrepancy other than a prejudiced bias embedded in the algorithm. As the developer stated:
After Hanson's report, several other users claimed to have run into similar problems. One of them was none other than Steve Wozniak, co-founder of Apple:
The repercussions of the story and the number of reports published on social media were enough to draw the attention of the authorities. In a post on Medium, the Superintendent of the New York State Department of Financial Services, Linda Lacewell, said she and her team will investigate the case and seek more information about the algorithm from Apple and Goldman Sachs.
The official further recalled that New York State law prohibits discrimination against protected classes of individuals, and therefore any method for determining credit limits, including algorithms, cannot treat people differently based on age, religion, race, color, gender, sexual orientation, nationality, or other such characteristics. Similar laws apply in several other US states and territories.
Apple did not comment on the case, but Goldman Sachs took to Twitter to give its version of the facts. The bank’s statement reads as follows:
With the Apple Card, your account is individual, as is your credit line; you establish your own direct credit history. Customers do not share lines of credit under the account of a family member or another person with additional cards.
As with any other individual credit card, your application is assessed independently. We analyze each individual's income and creditworthiness, which includes factors such as personal credit score, the debts you have, and how those debts are managed. Based on these factors, it is possible for two people in the same family to receive significantly different credit decisions.
In all cases, we do not and will not make decisions based on factors such as gender.
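To see how individually assessed applications can diverge within one household, here is a purely hypothetical toy model. Goldman Sachs has not disclosed its actual formula; the function name, weights, and normalization below are all invented for illustration, using only the factors the statement names (income, credit score, and debt):

```python
# Hypothetical illustration only -- NOT Goldman Sachs' real model.
# A toy credit-limit formula built from the three factors the bank's
# statement mentions: income, personal credit score, and outstanding debt.

def toy_credit_limit(income: float, credit_score: int, debt: float) -> float:
    """Return an illustrative credit limit from individually assessed inputs."""
    utilization_penalty = debt / max(income, 1.0)   # how leveraged the applicant is
    score_factor = credit_score / 850               # normalize a FICO-style score (300-850)
    limit = income * 0.2 * score_factor * max(0.0, 1 - utilization_penalty)
    return round(limit, 2)

# Two members of the same household, each scored on their own inputs:
spouse_a = toy_credit_limit(income=120_000, credit_score=740, debt=30_000)
spouse_b = toy_credit_limit(income=40_000, credit_score=760, debt=5_000)
```

In this sketch, spouse B has the higher credit score yet receives the lower limit, because income and debt also enter the formula. That is the shape of the phenomenon Hanson reported: a higher score alone does not guarantee a higher limit once other individual factors differ. Whether the real model behaves fairly, of course, is exactly what the New York investigation aims to find out.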
In any case, the New York department's investigation is still ongoing, and we should hear more about the case soon. What do you think?