Research | 12/06/2019
Prof Talk | Discrimination by algorithms
Algorithms are playing an ever more important role in our lives, powering all manner of services, from search engines to personalized Netflix recommendations. Without realizing it, you are being monitored all day long, and the data you leave behind is used to put you in boxes. The government, too, is making increasing use of algorithms, and with this comes the risk of discrimination, broadcaster NOS reported last week. Algorithm experts Jack van Wijk and Mykola Pechenizkiy, both involved in the Data Science Center Eindhoven, stress the need for transparency and advise caution.
It all seems so handy: the Spotify list of song tips, the selected messages on your Facebook timeline, the automatically generated weekly shopping list. Our daily life seems to be increasingly shaped by algorithms. Very simply put, algorithms are step-by-step recipes for achieving a particular aim, fed by the mountain of data entered into the system.
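To make that concrete, here is a toy sketch of such a recipe. The listening data and the overlap rule are invented purely for illustration and bear no relation to how Spotify or any real service actually works.

```python
# Toy "song tips" recipe: recommend whatever listeners with
# overlapping taste also played. All data here are made up.
listened = {
    "you":   {"song1", "song2"},
    "user2": {"song1", "song2", "song3"},
    "user3": {"song2", "song4"},
}

tips = set()
for user, songs in listened.items():
    if user != "you" and listened["you"] & songs:  # shared taste?
        tips |= songs - listened["you"]            # suggest the rest

print(tips)  # e.g. {'song3', 'song4'}
```

The point is not the code itself but the pattern: a fixed recipe, applied to whatever data flows in.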
But the drawbacks of these smart formulas are equally evident. Based on what you search for, buy and 'like', a profile of you builds up, and what happens to this profile is not always clear. Besides, your view of the world is quite heavily tinted by what social media show you, so that you end up in a filter bubble of one-sided news.
Right to be treated equally
Decisions are increasingly being automated, including those made by government agencies. This isn't all bad, professor of Information Visualization Jack van Wijk is quick to point out. “It makes it easier to organize data and often saves a great deal of time. Nonetheless, we need to be on our guard where these automated decision-making systems are concerned. Because there's a real risk that algorithms will infringe people's right to be treated equally.”
One of the chief reasons for this is that algorithms use data from the past, explains Mykola Pechenizkiy, professor of Data Mining. “Algorithms are trained on historical data, and those data contain the prejudices of everyone who made these earlier decisions. What's more, this can be self-reinforcing. In a well-known example, an algorithm directs the police to carry out extra patrols in a certain neighborhood, typically home to people with lower incomes or of a particular ethnicity. The extra patrols produce more arrests, and the new arrest data send the police back to that same neighborhood. Human intervention is very much needed to prevent this type of discrimination.”
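The self-reinforcing loop can be shown with a small simulation. This is a hypothetical sketch, not the model any police force actually uses; the neighborhood names, crime rates and patrol numbers are all invented.

```python
import random

random.seed(0)

# Two neighborhoods with IDENTICAL true crime rates, but a small
# historical bias in the recorded arrests. (All numbers invented.)
true_crime_rate = {"A": 0.10, "B": 0.10}
arrests = {"A": 5, "B": 4}

for year in range(10):
    # The "algorithm": patrol wherever past arrests are highest.
    target = max(arrests, key=arrests.get)
    # 100 patrols, each with a 10% chance of producing an arrest.
    new = sum(random.random() < true_crime_rate[target] for _ in range(100))
    arrests[target] += new
    print(year, target, arrests)
```

Every year the patrols return to neighborhood A and its arrest record snowballs, while B's record never changes: the data confirm a bias they themselves created.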
The computer decides, but ultimately people must be held accountable for these decisions. We must be alert to the fact that things could go seriously wrong, warns Van Wijk. “As long as the black box stays shut, it is very difficult to see exactly what is happening. That's why transparency must be our watchword. Computer scientists, ethicists and lawyers must study not only what takes place within the algorithm, but also the procedures that come before and after it. It's not only the algorithm itself; the data used must also be transparent: where do they come from, and how are they linked, filtered and selected? Garbage in, garbage out. Or perhaps even better: garbage in, toxic waste out.”
The Data Science Center Eindhoven likewise has a strong focus on transparency. Pechenizkiy: “We recently set up the new research program Responsible Data Science, within which we are taking a very multidisciplinary approach to issues like these. The scientific and technical puzzles, as well as the legal and ethical aspects: it is complex material. Through projects carried out in cooperation with, for example, the National Police Corps or Rabobank, we can do our bit towards improving how algorithms are used in a wide range of areas.”
Screening of algorithms
The screening of algorithms should become standard practice for government bodies, both professors believe. Van Wijk: “I expect algorithm auditing to take off in a big way. It is only logical that government bodies be permitted to use a certain algorithm only after carefully selected parties have carried out an external study. And we're not talking only about the piece of code that spits out the decision; the whole process leading up to it and following it needs to be examined. In addition, as citizens we have to realize that it is very dangerous to place blind faith in what a computer puts on the screen. Ask probing questions if you don't understand why you are being required to jump through extra hoops when applying for a grant or social benefit. Why me, and what rules are you using? It is not always easy to stand up to the bureaucratic machinery, but fortunately all kinds of agencies are monitoring this closely. And for anyone who doesn't want to become entangled in the algorithms of daily life: remember that you're leaving more tracks on the web than you realize.”
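As a hypothetical illustration of one small piece of such an audit, the check below compares approval rates across groups (a demographic-parity check). The decision records are invented; a real audit would, as Van Wijk argues, also examine the data and procedures before and after the model.

```python
from collections import defaultdict

# Invented decision records; in practice these would come from the
# audited system's own logs.
records = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
for r in records:
    counts[r["group"]][0] += r["approved"]
    counts[r["group"]][1] += 1

for group, (approved, total) in counts.items():
    print(f"group {group}: approval rate {approved / total:.0%}")
# A large gap between groups is a red flag that warrants probing
# the data and decisions behind the numbers.
```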