Could something created by humans reproduce uniquely human behavior? Or, more specifically, could an algorithm be sexist?
Microsoft researcher Tarleton Gillespie, one of the great thinkers in the field, argues that it can: because it is a human creation, any algorithm carries in its standards and selections a reflection of human characteristics, both good and bad. And Apple may be the latest company to illustrate this notion.
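Gillespie's point can be made concrete with a toy sketch. The data, function names, and decision rule below are entirely hypothetical (they are not Apple's or Goldman Sachs's actual model); the sketch only illustrates how a system that never receives gender as an input can still reproduce a historical bias through a correlated proxy feature:

```python
# Hypothetical historical records: (income, proxy_feature, approved_limit).
# The proxy feature (say, a spending-category flag) happens to correlate
# with gender in the past lending data -- the bias lives in the data,
# not in any explicit "gender" input.
history = [
    (80_000, 0, 20_000),  # proxy=0 group historically received higher limits
    (80_000, 0, 21_000),
    (80_000, 1, 10_000),  # proxy=1 group historically received lower limits
    (80_000, 1, 9_000),
]

def learned_limit(income: int, proxy: int) -> float:
    """Naive stand-in for a trained model: predict the average limit
    granted to past applicants with the same proxy value."""
    matches = [lim for inc, p, lim in history if p == proxy]
    return sum(matches) / len(matches)

# Two applicants with identical income; gender is never used as an input,
# yet the learned rule still hands them very different limits.
limit_a = learned_limit(80_000, proxy=0)  # -> 20500.0
limit_b = learned_limit(80_000, proxy=1)  # -> 9500.0
```

The design point is that removing the sensitive attribute from the inputs does not remove the bias: any feature correlated with it lets the model relearn the same pattern from the training data.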
Let us explain: in recent days, a controversy has formed around the credit algorithm of the Apple Card. For those who don't know, Apple's credit card has an almost automatic setup process, in which the user only has to provide some data and their credit limit is displayed almost instantly by Apple (and the bank, Goldman Sachs). It all started when the developer David Heinemeier Hansson, creator of the Ruby on Rails framework, accused the card's algorithm of being sexist:
Hansson said the problem goes deeper, noting that, unlike his card, his wife's Apple Card does not allow new purchases until the next billing period, even if she has already paid the latest bill. The card's customer service was also criticized by the developer, who complained that agents are not allowed to discuss the credit-limit process and cannot provide any justification for the decision.
Hansson and his wife even paid US$25 to check both of their credit scores in the US banking system and, to their surprise (and that of Apple, which was in contact with the couple), her score was higher than her husband's. That is, there would be no plausible explanation for the phenomenon other than a bias embedded in the algorithm. As the developer put it:
Following Hansson's report, several other users claimed to have experienced similar problems. One of them was none other than Steve Wozniak, co-founder of Apple:
The repercussions of the story and the number of reports published on social media were enough to draw the attention of the authorities. In a post on Medium, the Superintendent of the New York State Department of Financial Services, Linda Lacewell, said she would investigate the case with her team and seek more information about the algorithm from Apple and Goldman Sachs.
The official further recalled that New York State law prohibits discrimination against any protected class of individuals and that, therefore, any method of determining credit limits, including algorithms, may not treat people differently based on age, religion, race, color, gender, sexual orientation, nationality, or other such characteristics. Similar laws apply in various other US states and in jurisdictions around the world.
Apple did not comment on the case, but Goldman Sachs took to Twitter to give its version of the facts. The bank's statement reads as follows:
With the Apple Card, your account is individual, as is your credit line; you establish your own direct credit history. Customers do not share credit lines under the account of a family member or another person through additional cards.
Like any other individual credit card, your application is evaluated independently. We analyze each individual's income and creditworthiness, which includes factors such as personal credit score, how much debt you have, and how that debt has been managed. Based on these factors, it is possible for two people in the same family to receive significantly different credit decisions.
In all cases, we do not and will not make decisions based on factors such as gender.
In any case, the New York Department's investigation is still ongoing, and we should hear more about the case soon. What do you think?