Here are links to a few interesting news articles I came across recently. They are about the supposedly “unexpected”, yet highly predictable, effects of “big data”-derived algorithms on the ability of societies to exploit and abuse their members.
In case you are wondering, my recent series of link-posts is a buildup to a few upcoming interconnected series on issues such as the mechanisms behind the ongoing and inevitable demise of modern nation-states.
Link 1: Digital Star Chamber
In a recent podcast series called Instaserfs, a former Uber driver named Mansour gave a chilling description of the new, computer-mediated workplace. First, the company tried to persuade him to take a predatory loan to buy a new car. Apparently a number cruncher deemed him at high risk of defaulting. Second, Uber would never respond to him in person – it just sent text messages and emails. This style of supervision was a series of take-it-or-leave-it ultimatums – a digital boss coded in advance. Then the company suddenly took a larger cut of revenues from him and other drivers. And finally, what seemed most outrageous to Mansour: his job could be terminated without notice if a few passengers gave him one-star reviews, since that could drag his average below 4.7. According to him, Uber offers no real avenue of appeal or other due process for a rating system that can instantly put a driver out of work – it simply crunches the numbers.
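It is worth putting numbers on how fragile that rating cutoff is. The 4.7 threshold comes from the podcast; the starting average of 4.8 over 100 rides is purely my own illustrative assumption:

```python
def ratings_to_drop(avg, n_rides, cutoff=4.7, bad=1.0):
    """How many consecutive one-star ratings it takes to drag
    a driver's running average below the deactivation cutoff."""
    total, n, k = avg * n_rides, n_rides, 0
    while total / n >= cutoff:
        total += bad  # one more 1-star rating
        n += 1
        k += 1
    return k

print(ratings_to_drop(4.8, 100))  # -> 3
```

Three bad rides out of more than a hundred – from passengers who may have been angry about traffic or surge pricing – and the algorithm ends your job.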
For wines or films, the stakes are not terribly high. But when algorithms start affecting critical opportunities for employment, career advancement, health, credit and education, they deserve more scrutiny. US hospitals are using big data-driven systems to determine which patients are high-risk – and data far outside traditional health records is informing those determinations. IBM now uses algorithmic assessment tools to sort employees worldwide on criteria of cost-effectiveness, but spares top managers the same invasive surveillance and ranking. In government, too, algorithmic assessments of dangerousness can lead to longer sentences for convicts, or no-fly lists for travellers. Credit-scoring drives billions of dollars in lending, but the scorers’ methods remain opaque. The average borrower could lose tens of thousands of dollars over a lifetime, thanks to wrong or unfairly processed data.
Link 2: US No-Fly List Uses ‘Predictive Judgement’ Instead of Hard Evidence
The Guardian reports that in a little-noticed filing before an Oregon federal judge, the US Justice Department and the FBI conceded that stopping US and other citizens from traveling on airplanes is a matter of “predictive assessments about potential threats.” “By its very nature, identifying individuals who ‘may be a threat to civil aviation or national security’ is a predictive judgment intended to prevent future acts of terrorism in an uncertain context,” Justice Department officials Benjamin C Mizer and Anthony J Coppolino told the court. It is believed to be the government’s most direct acknowledgment to date that people are not allowed to fly because of what the government believes they might do, not what they have already done. The ACLU has asked Judge Anna Brown to conduct her own review of the error rate in the government’s predictive modeling – a process the ACLU likens to the “pre-crime” of Philip K Dick’s science fiction. “It has been nearly five years since plaintiffs on the no-fly list filed this case seeking a fair process by which to clear their names and regain a right that most other Americans take for granted,” say ACLU lawyers.
The Obama administration is seeking to block the release of further information about how the predictions are made, arguing that disclosure would damage national security. “If the Government were required to provide full notice of its reasons for placing an individual on the No Fly List and to turn over all evidence (both incriminating and exculpatory) supporting the No Fly determination, the No Fly redress process would place highly sensitive national security information directly in the hands of terrorist organizations and other adversaries,” says the assistant director of the FBI’s counterterrorism division, Michael Steinbach.
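The ACLU’s demand to audit the error rate matters because of the base-rate problem: when the behaviour being predicted is extremely rare, even a very accurate screen flags far more innocent people than real threats. Here is a minimal sketch with entirely made-up numbers (one real threat per 10,000 people screened, 99% sensitivity and 99% specificity – none of these figures come from the government’s filings):

```python
def screening_outcomes(population, base_rate, sensitivity, specificity):
    """Expected true and false positives when a predictive
    screen is applied to an entire population."""
    threats = population * base_rate
    innocents = population - threats
    true_pos = threats * sensitivity          # real threats flagged
    false_pos = innocents * (1 - specificity)  # innocents flagged
    return true_pos, false_pos

tp, fp = screening_outcomes(1_000_000, 1e-4, 0.99, 0.99)
print(round(tp), round(fp))  # -> 99 9999
```

Under these (hypothetical) assumptions, roughly a hundred innocent travellers get grounded for every genuine threat caught – which is exactly why an independent review of the real error rate, rather than secrecy, is the sane response.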
Link 3: Data-Crunching Could Kill Your Downtime At Work
How many of you are reading this at work? One of the unspoken perks of many white-collar jobs is that you can waste time while still appearing productive. Workplaces are aware that this goes on, and they police it to some extent by blocking Facebook or simply looking over your shoulder — but there’s only so much they can do. The new generation of workplace analytics software is starting to change that. “Employers of all types — old-line manufacturers, nonprofits, universities, digital start-ups and retailers — are using an increasingly wide range of tools to monitor workers’ efforts, help them focus, cheer them on and just make sure they show up on time.” This inevitably leads to the question: does cracking the whip more often actually increase productivity? To hear the makers of this software tell it, the value is almost limitless, and it will never be misused to micromanage your job. But the article lacks any independent support for that idea, and I’m sure many of you could provide examples where time-keeping software has only been a hindrance.
What do you think? Comments?