Apple banked on Goldman Sachs and promised its customers a different credit card experience, one that now appears to come with a sexist bias against women.
Apple products are white, stylish and cutting-edge. And now we know that one of them discriminates on the basis of gender. A gender-bias controversy that has taken the US by storm over the last few days has seen many Apple Card users, including well-known tech entrepreneur David Heinemeier Hansson and Apple co-founder Steve Wozniak, come out and ask:
Is the new credit card, "created by Apple, not a bank", sexist towards women?
It all came to light when Hansson's tweet went viral: he questioned why his Apple Card's credit limit was 20 times higher than his wife's. This is anomalous because, as Hansson stated, his wife has a higher credit score and the couple files joint tax returns. Since then, the New York State Department of Financial Services (NYSDFS) has taken up the matter, and an investigation into the card's issuer, Goldman Sachs, is underway.
Apple’s Card is managed by Goldman Sachs
In the Twitter thread, Hansson, creator of the web framework Ruby on Rails, said that Apple's over-reliance on a "biased" algorithm doesn't exempt the company from responsibility for "discriminatory treatment". Wozniak also reported a credit limit 10 times higher than his wife's. Apple released the titanium card earlier in 2019. It is managed by Goldman Sachs, which means Apple doesn't run the card itself and mostly provides the brand name.
The NYSDFS is probing whether the issue potentially violates state laws banning sex discrimination and whether a discriminatory algorithm is responsible. However, Goldman Sachs Bank USA CEO Carey Halio has denied any wrongdoing, declaring, "we have not and will not make decisions based on factors like gender." Halio also said that Goldman Sachs was 'open' to reevaluating credit limits for users who believe their Apple Card credit line does not reflect their credit history.
Goldman Sachs spokesman Andrew Williams also tried to explain why two family members can "receive significantly different credit decisions", citing 'individual income and creditworthiness' and 'personal credit scores and debt levels.' However, when major US media outlets reached out to Apple, it directed them to Goldman Sachs. The tech giant has stayed quiet on the issue for almost a week now.
Apple risked its reputation banking on an algorithm not under its control
Apple has a certain reputation in the global market. So it knew the stakes when it put its valuable customer experience for this product on the line by handing control to another organization. Now that an algorithm has turned sexist towards some of its customers, Apple can't pretend it bears no responsibility. Even Wozniak opined that "they (Apple) should share responsibility."
Similarly, Hansson has repeatedly accused Apple of hiding behind Goldman Sachs. Hansson also said, "I don't feel like I'm a customer of Goldman Sachs. I feel like I'm a customer of Apple". Likewise, the NYSDFS has said that new innovations in technology shouldn't become discriminatory towards certain consumers.
This is not the first case in which an algorithm may have behaved in a potentially discriminatory way.
The NYSDFS is also probing another case in which an algorithm belonging to the US healthcare giant UnitedHealth Group allegedly exhibited racial bias, recommending substandard treatment for Black patients compared with their white counterparts.
Apple shares responsibility because it 'created' the card
Such instances are breeding skepticism among consumers as more and more corporations turn to algorithms. In fact, AI-based algorithms often work in an opaque manner, making decisions on parameters that remain a mystery even to their creators.
While such algorithms may not be intentionally fed parameters like gender or race, the data they are trained on shapes their assumptions. So if an algorithm is given data in which some women customers appear to pose a greater financial risk than men, it might generalize that into a sexist assumption, even if it isn't true at the macro level. Hansson said the issue "highlights how algorithms, not just people, can discriminate".
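The mechanism can be illustrated with a deliberately simplified sketch. This is hypothetical toy code, not the actual Apple Card model: it assumes a "proxy" feature (say, a spending-category code) that happens to correlate with gender, and historical decisions that granted lower limits to one group. Gender is never an input, yet the bias survives into the model's outputs.

```python
# Toy illustration of proxy bias (hypothetical data, NOT the real model):
# gender is never an input, but a proxy feature correlated with gender
# carries historical bias into the algorithm's credit decisions.

from statistics import mean

# Hypothetical past records: (proxy_feature, credit_limit_granted).
# Group "A" happens to correlate with men, "B" with women, and past
# human decisions granted group "B" much lower limits.
history = [
    ("A", 20000), ("A", 25000), ("A", 22000),
    ("B", 2000),  ("B", 2500),  ("B", 1800),
]

def fit_limits(records):
    """'Train' by averaging the limits historically granted per proxy value."""
    by_proxy = {}
    for proxy, limit in records:
        by_proxy.setdefault(proxy, []).append(limit)
    return {proxy: mean(limits) for proxy, limits in by_proxy.items()}

model = fit_limits(history)

# Two applicants with identical finances but different proxy values
# receive very different limits: the bias is inherited, not programmed.
print(round(model["A"]))  # ~22333
print(round(model["B"]))  # 2100
```

The point of the sketch is that nobody wrote "if woman, lower limit" anywhere; the discrimination emerges purely from the training data, which is exactly why regulators are asking how such systems are audited.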
That said, Apple gave its name to the allegedly 'sexist' card. It also promised its customers the same quality they have come to expect from the tech giant's other products. Consequently, when the algorithm misbehaves, Apple cannot simply shrug and say that it bears no responsibility.