Public bodies in the United Kingdom are increasingly using algorithms and big data to make decisions. While there is growing awareness of the risks of algorithmic discrimination, it can be very difficult to establish that a specific algorithm is in fact discriminatory. This raises the question of how courts should allocate the burden of testing and proving whether a government algorithm is discriminatory, as between the government and the person affected. In R (Bridges) v South Wales Police [2020] EWCA Civ 1058, the Court of Appeal of England and Wales held that public bodies must take positive steps to identify and address risks of algorithmic discrimination. This note explores the decision in Bridges and its implications for algorithmic decision-making in government. It suggests that Bridges, alongside recent decisions in Canada and the Netherlands, forms part of a broader trend: courts are placing the burden of testing and reviewing potentially discriminatory algorithms on government, rather than on the general public.
Number of pages: 8
Journal: Oxford University Commonwealth Law Journal
Early online date: 21 Oct 2020
Publication status: E-pub ahead of print - 21 Oct 2020