| Journal | Oxford University Commonwealth Law Journal |
| --- | --- |
| Date | Accepted/In press - 8 Jul 2020 |
| Date | E-pub ahead of print (current) - 21 Oct 2020 |
| Issue number | 2 |
| Volume | 20 |
| Number of pages | 8 |
| Pages (from-to) | 352-360 |
| Early online date | 21/10/20 |
| Original language | English |
Public bodies in the United Kingdom are increasingly using algorithms and big data to make decisions. While there is growing awareness of the risks of algorithmic discrimination, it can be very difficult to establish that a specific algorithm is in fact discriminatory. This raises the question of how courts should allocate the burden of testing and proving whether a government algorithm is discriminatory, as between the government and the person affected. In R (Bridges) v South Wales Police [2020] EWCA Civ 1058, the Court of Appeal of England and Wales held that public bodies must take positive steps to identify and address risks of algorithmic discrimination. This note examines the decision in Bridges and its implications for algorithmic decision-making in government. It suggests that Bridges, alongside recent decisions in Canada and the Netherlands, forms part of a broader trend: courts are placing the burden of testing and reviewing potentially discriminatory algorithms on government, rather than on the general public.
© 2020 Faculty of Law, Oxford University. This is an author-produced version of the published paper. Uploaded in accordance with the publisher’s self-archiving policy. Further copying may not be permitted; contact the publisher for details.