Here’s How We Can Protect Ourselves From The Hidden Algorithms That Influence Our Lives
Alan Reid, 26 Feb 17
In political terms, 2016 was a year of uncertainty. Yet it also saw the rising dominance of algorithms: complex sets of mathematical instructions that follow pre-set rules and are increasingly used in technology designed to predict, control and alter human behaviour.

Algorithms try to use the past as an indicator of the future. In principle, they are neutral: they have no prejudices and no emotions. But algorithms can be programmed to be biased, or unintentional bias can creep into the system. They also allow large corporations to make largely hidden decisions about how they treat consumers and employees, and they allow government organisations to decide how to distribute services and even justice.

The danger of algorithms being used unfairly or even illegally has led to recent calls by the UK Labour party for greater regulation not just of tech firms but of the algorithms themselves. But what would tighter rules on algorithms actually cover? Is it even possible to regulate such a complex area of technology?

Algorithms are used by governments and corporations alike to try to foresee the future and inform decision making. Google, for example, uses algorithms to auto-complete its search box as you type and to rank the websites it lists after you hit return, directing you to some sites over others. Self-driving cars use algorithms to decide their route and speed, and potentially even whom to run over in an emergency.

Financial corporations use algorithms to assess your risk profile, to determine whether they should give you a loan, credit card or insurance. If you are lucky enough to be offered one of their products, they will then work out how much you should pay for that product. Employers do the same to select the best candidates for the job and to assess their workers’ productivity and abilities.

Governments around the world are also becoming big adopters of algorithms. Predictive policing algorithms allow police forces to focus limited resources on crime hotspots. Border security officials use algorithms to determine who should be on a no-fly list. Judges could soon use algorithms to assess an offender's risk of re-offending and select the most appropriate sentence.


Given the extensive influence algorithms now have over our lives, it is not surprising that politicians would like to bring them under greater control. But algorithms are usually commercially sensitive and highly lucrative. Corporations and government organisations will want to keep the exact workings of their algorithms secret, and those workings may be protected by intellectual property rights, such as patents, and by confidentiality agreements. So regulating the algorithms themselves will be extremely difficult.

This hidden nature of algorithms might itself be a fruitful target for regulation. The law could be amended to force all companies and government agencies to publicise more widely the fact that decisions within the organisation are made by algorithm. But such an approach would only improve transparency. It would do nothing to regulate the algorithmic process itself. So the focus of regulation would need to shift to the inputs and outputs of the algorithm.

In the UK, the current law of judicial review would be enough to cover the input of data into algorithms by governmental bodies. Judicial review allows judges to assess the legality of decisions taken by public bodies, so judges could determine whether the data fed into an algorithm was correct, relevant and reasonable. The ultimate decision taken by a public body on the basis of the algorithm's output would also be subject to judicial review, with judges asking whether that decision was proportionate, lawful and reasonable.
