Why ‘sexist’ Apple Card is Cupertino’s worst nightmare

Sexist Apple Card accusations are tech's latest example of biased algorithms
Apple Card was meant to be the solution, not part of the problem.
Photo: Apple

Cupertino pitched Apple Card as the greatest credit card in history. Instead, the card generated negative PR based on accusations that the algorithm used to decide credit limits is discriminatory.

Even Apple co-founder Steve Wozniak complained about the supposedly sexist algorithm. Woz said he received 10 times the credit limit his wife did, even though the couple shares bank accounts and assets. Here’s how Apple became the latest tech giant to be accused of algorithmic bias — and what that means.

Problems with Apple Card algorithm

Entrepreneur David Heinemeier Hansson first raised a red flag about the Apple Card algorithm on Twitter last week. He said he received 20 times the credit limit of his wife, despite the fact that they file joint tax returns.

“I was deeply annoyed to be told by Apple Card representatives, ‘It’s just the algorithm,’ and ‘It’s just your credit score,’” Jamie Heinemeier Hansson wrote in an article for Fast Company. “I have had credit in the U.S. far longer than David. I have never had a single late payment. I do not have any debts. David and I share all financial accounts, and my very good credit score is higher than David’s.”

Heinemeier Hansson’s complaint prompted New York’s Department of Financial Services to open an inquiry into the practices of Apple Card issuer Goldman Sachs. Anti-discrimination laws prohibit credit decisions, whether made by a person or an algorithm, from being based on protected characteristics such as age, creed, race, color, sex and sexual orientation.

We live in a world in which algorithms weigh heavily on our everyday experiences. They affect everything from the news we see to our eligibility for home loans to, yes, our credit limits.

One of the big promises of the algorithmization of the world was that it would eliminate bias. A human manager might be more likely to hire people who look like them. But an algorithm tasked with the same job would, ideally, judge candidates objectively. It’s a beautiful dream of meritocracy.

Algorithmic bias

And yet the possible gender discrimination with Apple Card shines a light on the subject of algorithmic bias. However much we like to view our technology as magic, it isn’t: human coders program the algorithms. Things get even more complicated when black-box neural networks are involved. These AI tools are largely inscrutable even to the programmers who build them, who see only the inputs and the outputs. The computer works out the messy middle part, turning one into the other.

But bias can creep in through the data an algorithm is given. In some cases, that reflects explicit prejudice on the part of the coder. More often, it is simply a blind spot. That’s what happened in the 2015 Google Photos controversy, in which the app’s image recognition system labeled two black people as gorillas. (The algorithm hadn’t been trained on enough pictures of black people.)
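
To see how that kind of blind spot arises, here is a minimal sketch in Python using scikit-learn and entirely made-up data (this is an illustration of the general problem, not Google’s or Apple’s actual systems): a classifier trained on data in which one group is barely represented ends up far less accurate for that group, even though nothing in the code singles anyone out.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def group_a(n):
    # Majority group: label depends on x1 + x2.
    X = rng.normal(0.0, 1.0, size=(n, 2))
    return X, (X[:, 0] + X[:, 1] > 0).astype(int)

def group_b(n):
    # Minority group: different feature distribution and a different label rule.
    X = rng.normal(4.0, 1.0, size=(n, 2))
    return X, (X[:, 0] - X[:, 1] > 0).astype(int)

# Training data is dominated by group A; group B is barely represented.
Xa, ya = group_a(5000)
Xb, yb = group_b(50)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# The same model is far less accurate for the group it rarely saw in training.
Xa_t, ya_t = group_a(2000)
Xb_t, yb_t = group_b(2000)
print("group A accuracy:", round(model.score(Xa_t, ya_t), 3))
print("group B accuracy:", round(model.score(Xb_t, yb_t), 3))

No one wrote a prejudiced rule here; the skew in the training set does all the damage.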

AI also can embed societal prejudice in algorithmic form by predicting the future from a flawed past. An AI judging a beauty contest by studying past winners is likely to absorb the standards of beauty those winners represent (historically, mostly white faces) and carry them forward into its own decisions.
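
The same dynamic applies to credit. Below is a second minimal sketch with synthetic data (the proxy feature and every number are invented for illustration, and this is not a claim about how Goldman Sachs’ model works): a regression trained on historical limits that systematically shortchanged one group keeps doing so for identical financial profiles, even though the protected attribute itself is never fed to the model, because a correlated feature stands in for it.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 10_000

income = rng.normal(80_000, 20_000, n)
gender = rng.integers(0, 2, n)           # used only to generate the biased history
proxy = gender + rng.normal(0, 0.1, n)   # an innocuous-looking feature correlated with gender

# Historical limits: same income, but one group was systematically given less.
past_limit = 0.3 * income - 5_000 * gender + rng.normal(0, 1_000, n)

# Train WITHOUT the gender column, using only income and the proxy feature.
X = np.column_stack([income, proxy])
model = LinearRegression().fit(X, past_limit)

# Two applicants with identical incomes still get very different predicted limits.
print(model.predict([[80_000.0, 0.0], [80_000.0, 1.0]]))

Dropping the sensitive column is not enough: as long as the training history is skewed and some other feature tracks the protected attribute, the model reproduces the old pattern.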

Goldman Sachs responds

It’s not yet clear exactly what is wrong with the algorithm used to determine Apple Card credit limits. The algorithm may well have been developed by Goldman Sachs, Apple’s financial partner in Apple Card.

“We have not and never will make decisions based on factors like gender,” said Carey Halio, Goldman Sachs’ retail bank CEO, in a statement responding to the uproar. “In fact, we do not know your gender or marital status during the Apple Card application process.” However, the bank says it will be happy to review Apple Card applications again based on updated information.

Bad optics for Apple Card

No matter the cause, this is bad optics for Apple. Speaking to CNBC, Heinemeier Hansson put the focus squarely on Cupertino.

“I don’t feel like I’m a customer of Goldman Sachs,” he said. “I feel like I’m a customer of Apple.”

While the small print may technically present the consumer-friendly Apple Card as a collaboration between Goldman Sachs and Cupertino, Apple is the face of this controversy. It’s called Apple Card, after all.

“Do I go to Foxconn if I have an issue with my iPhone?” Heinemeier Hansson said. “Of course I don’t. I go to Apple. This is an Apple product, and Apple owns this fully.”

Can Apple Card algorithm be fixed?

Can Apple (and Goldman Sachs) fix this problem? On a technical level, certainly. Will the story eventually fade from the news cycle? Of course. But the claims of gender discrimination don’t look good. And the controversy is indicative of the kind of challenges Apple could face as it moves more heavily into the financial services world.

It’s great that Apple Card makes it easier for people to manage their money. But offering financial services also means denying them to some people. That alone has the potential to generate negative PR for Apple, even if the algorithms work perfectly. When they don’t, as seems to be the case here, the problem is compounded.

Apple’s AI challenge

Apple has long been the “good” computer company. It makes products that are transparent in how they work. Its current CEO, Tim Cook, has talked about Apple being a “force for good” in the world. Apple consistently fights for social causes, including battling gender discrimination. There’s no company from which we would happily accept potentially biased algorithms, but such allegations are especially troubling when they involve Apple’s credit card.

Today, Apple is pushing into fields like artificial intelligence more forcefully than ever. After lagging in this area for years, the company now gives technologies such as deep learning neural networks a growing role in its products.

Apple dedicates much time and effort to making sure people trust these tools. The company promotes privacy and accountability even as it hypes its latest hardware and software offerings.

This Apple Card algorithm debacle highlights how these efforts can backfire, even for a thoroughly progressive company like Apple. Hopefully all involved can learn a valuable lesson from this situation.
