1. Possible gender discrimination in Apple Card
Did you read about the story that went viral on Twitter, about the Apple Card offering much higher credit limits to men than to women, even when the latter have demonstrably the same or even better financial situations? [If not, read this or this]
The person who posted the original story, tech entrepreneur David Heinemeier Hansson, said that Apple had given him a credit limit 20x higher than his wife's. Other people joined the conversation, sharing similar stories. Steve Wozniak (yes, Apple's co-founder!), for instance, wrote that he had been offered 10x the credit limit offered to his wife, with whom he shared all accounts and assets.
I had a similar experience many years ago. Not with credit cards, but with mobile phone upgrades. I had been in the UK for longer than my husband; we had shared bank accounts and assets; and I had a higher salary than him. We also had the same type of phone contract. Yet, when the time came to get a phone upgrade, he was offered one, but I wasn’t. That was 17 or so years ago. It is so, so depressing that we are still facing the same stupid bias against women. I am so tired of this… 😩
[Side note: this situation (and others like it) inspired my PhD topic: how do firms profile ‘undesirable’ customers? Eventually, the topic was narrowed down to the case of suspected money laundering. Incidentally, last week was the 12th anniversary of my PhD viva.]
2. Analysis of the fatal crash caused by an Uber self-driving vehicle
Six (!) months ago, Wired published an article reporting on some of the key findings from the National Transportation Safety Board’s investigation of the fatal crash of one of Uber’s self-driving vehicles. So, I was a bit late to the party when I finally read the article. (Ah, the joys of cleaning out your e-mail inbox.) Still, it is a really interesting article, very much worth a read. Here is a snippet:
“The most glaring mistakes were software-related. Uber’s system was not equipped to identify or deal with pedestrians walking outside of a crosswalk. Uber engineers also appear to have been so worried about false alarms that they built in an automated one-second delay between a crash detection and action. (…)
Much of that explains why, despite the fact that the car detected Herzberg with more than enough time to stop, it was traveling at 43.5 mph when it struck her and threw her 75 feet. When the car first detected her presence, 5.6 seconds before impact, it classified her as a vehicle. Then it changed its mind to “other,” then to vehicle again, back to “other,” then to bicycle, then to “other” again, and finally back to bicycle.
It never guessed Herzberg was on foot for a simple, galling reason: Uber didn’t tell its car to look for pedestrians outside of crosswalks. “The system design did not include a consideration for jaywalking pedestrians,” the NTSB’s Vehicle Automation Report reads. Every time it tried a new guess, it restarted the process of predicting where the mysterious object—Herzberg—was headed. It wasn’t until 1.2 seconds before the impact that the system recognized that the SUV was going to hit Herzberg, that it couldn’t steer around her, and that it needed to slam on the brakes.”
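To make the failure mode concrete, here is a toy Python sketch of the behaviour the NTSB describes. It is purely illustrative – the class sequence comes from the quote above and the one-second delay from the article, but the intermediate timestamps, the minimum-track-length threshold, and all the names are my own inventions, not Uber’s code:

```python
# Toy illustration (NOT Uber's actual code) of the failure the NTSB describes:
# every time the object's class changes, the tracking history is discarded,
# so the trajectory predictor keeps starting from scratch.

# (seconds before impact, guessed class); the 5.6s figure and the class
# sequence are from the article, the later timestamps are invented
OBSERVATIONS = [
    (5.6, "vehicle"), (5.0, "other"), (4.4, "vehicle"), (3.8, "other"),
    (3.2, "bicycle"), (2.6, "other"), (2.0, "bicycle"),
]
MIN_TRACK_POINTS = 3  # assumed: observations needed before a path can be predicted
ACTION_DELAY = 1.0    # the built-in one-second delay between detection and action

def simulate() -> None:
    history: list[float] = []
    last_class = None
    for t, guessed_class in OBSERVATIONS:
        if guessed_class != last_class:
            history = []  # a new guess wipes the track: prediction restarts
            last_class = guessed_class
        history.append(t)
        if len(history) >= MIN_TRACK_POINTS:
            print(f"{t:.1f}s before impact: enough history to predict a collision;")
            print(f"braking would only begin {ACTION_DELAY}s later")
            return
    print("Track never survived long enough to predict a path - no braking decision")

simulate()
```

With a class guess that keeps flip-flopping, the history never accumulates, and the decision to brake arrives far too late (in the real event, 1.2 seconds before impact).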

A few thoughts came to mind when reading this article.
First, there wasn’t a fault in the sensors, the brakes, or any other hardware element. The fault was entirely with the software: both the algorithm that decoded the input data from the road and the algorithm that dictated the subsequent actions.
Second, the algorithm that decoded the input from the road – or, rather, the programmer(s) who oversaw its development – could not fathom that humans might use the technical infrastructure around them in ways other than those it was designed for. For an industry / profession that prides itself on being imaginative and even visionary, there seems to be a basic lack of creativity about how people might actually use the infrastructure around them, from crosswalks to live streaming or 3D printing.
Third, even though the algorithm wasn’t sure of the nature of the obstacle ahead (Vehicle? Bicycle? Other?), it had detected an obstacle! So why didn’t it stop, or at least slow down dramatically? Aren’t these things supposed to keep people safe? Surely, continuing to drive towards an identified obstacle can’t be safe.
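What I would have expected is a defensive rule: any detected obstacle in the vehicle’s path is a reason to slow down, whatever its label. A minimal sketch of such a rule – entirely my own hypothetical, with invented names and thresholds:

```python
# A hypothetical defensive speed rule (my own sketch, not any real system's):
# if anything is detected in the path, slow down; argue about its class later.

def plan_speed(current_speed_mph: float, obstacle_in_path: bool,
               classification_confident: bool) -> float:
    """Return a target speed in mph, erring on the side of stopping."""
    if not obstacle_in_path:
        return current_speed_mph           # clear road: carry on
    if not classification_confident:
        return 0.0                         # unknown object ahead: stop
    return min(current_speed_mph, 10.0)    # known object: crawl, or stop

# The Uber car, by contrast, kept doing 43.5 mph towards an object it had
# detected 5.6 seconds earlier, because it could not settle on a label.
print(plan_speed(43.5, obstacle_in_path=True, classification_confident=False))  # -> 0.0
```

Under this rule, the moment the system first saw something in the road – whatever it thought that something was – the car would have been shedding speed instead of debating labels.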
3. Wanted – Professors of Foresight! Still
Mind you, the lack of imagination exhibited by Uber’s algorithm developers (story above) is not a new phenomenon. H. G. Wells, author of the novella “The Time Machine”, wrote an essay in 1932, entitled “Wanted – Professors of Foresight!”, in which he observed that:
“It seems an odd thing to me that though we have thousands and thousands of professors and hundreds of thousands of students of history working upon the records of the past, there is not a single person anywhere who makes a whole-time job of estimating the future consequences of new inventions and new devices. There is not a single Professor of Foresight in the world. But why shouldn’t there be? All these new things, these new inventions and new powers, come crowding along; every one is fraught with consequences, and yet it is only after something has hit us hard that we set about dealing with it.
See how unprepared our world was for the motor car. The motor car ought to have been anticipated at the beginning of the century. It was bound to come. It was bound to be cheapened and made abundant. It was bound to change our roads, take passenger and goods traffic from the railways, alter the distribution of our population, congest our towns with traffic. It was bound to make it possible for a man to commit a robbery or murder in Devonshire overnight and breakfast in London or Birmingham. Did we do anything to work out any of these consequences of the motor car before they came?”
