Proving algorithmic discrimination in government decision-making

Research output: Contribution to journal › Article › peer-review

Standard

Proving algorithmic discrimination in government decision-making. / Tomlinson, Joe; Maxwell, Jack.

In: OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL, Vol. 20, No. 2, 21.10.2020, p. 352-360.

Harvard

Tomlinson, J & Maxwell, J 2020, 'Proving algorithmic discrimination in government decision-making', OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL, vol. 20, no. 2, pp. 352-360.

APA

Tomlinson, J., & Maxwell, J. (2020). Proving algorithmic discrimination in government decision-making. OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL, 20(2), 352-360.

Vancouver

Tomlinson J, Maxwell J. Proving algorithmic discrimination in government decision-making. OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL. 2020 Oct 21;20(2):352-360.

Author

Tomlinson, Joe ; Maxwell, Jack. / Proving algorithmic discrimination in government decision-making. In: OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL. 2020 ; Vol. 20, No. 2. pp. 352-360.

BibTeX

@article{36e3dfdfe4864131b227154b08672940,
title = "Proving algorithmic discrimination in government decision-making",
abstract = "Public bodies in the United Kingdom are increasingly using algorithms and big data to make decisions. While there is growing awareness of the risks of algorithmic discrimination, it can be very difficult to establish that a specific algorithm is in fact discriminatory. This raises the question of how courts should allocate the burden of testing and proving whether a government algorithm is discriminatory, as between the government and the person affected. In R (Bridges) v South Wales Police [2020] EWCA Civ 1058, the England and Wales Court of Appeal found that public bodies must take positive steps to identify and address risks of algorithmic discrimination. This note explores the decision in Bridges and its implications for algorithmic decision-making in government. It suggests that Bridges, alongside recent decisions in Canada and the Netherlands, forms part of a broader trend: the courts are placing the burden of testing and reviewing potentially discriminatory algorithms on government, rather than the general public.",
author = "Joe Tomlinson and Jack Maxwell",
note = "{\textcopyright} 2020 Faculty of Law, Oxford University. This is an author-produced version of the published paper. Uploaded in accordance with the publisher{\textquoteright}s self-archiving policy. Further copying may not be permitted; contact the publisher for details.",
year = "2020",
month = oct,
day = "21",
language = "English",
volume = "20",
pages = "352--360",
journal = "OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL",
issn = "1472-9342",
publisher = "Taylor and Francis Ltd.",
number = "2",
}

RIS (suitable for import to EndNote)

TY - JOUR

T1 - Proving algorithmic discrimination in government decision-making

AU - Tomlinson, Joe

AU - Maxwell, Jack

N1 - © 2020 Faculty of Law, Oxford University. This is an author-produced version of the published paper. Uploaded in accordance with the publisher’s self-archiving policy. Further copying may not be permitted; contact the publisher for details.

PY - 2020/10/21

Y1 - 2020/10/21

N2 - Public bodies in the United Kingdom are increasingly using algorithms and big data to make decisions. While there is growing awareness of the risks of algorithmic discrimination, it can be very difficult to establish that a specific algorithm is in fact discriminatory. This raises the question of how courts should allocate the burden of testing and proving whether a government algorithm is discriminatory, as between the government and the person affected. In R (Bridges) v South Wales Police [2020] EWCA Civ 1058, the England and Wales Court of Appeal found that public bodies must take positive steps to identify and address risks of algorithmic discrimination. This note explores the decision in Bridges and its implications for algorithmic decision-making in government. It suggests that Bridges, alongside recent decisions in Canada and the Netherlands, forms part of a broader trend: the courts are placing the burden of testing and reviewing potentially discriminatory algorithms on government, rather than the general public.

AB - Public bodies in the United Kingdom are increasingly using algorithms and big data to make decisions. While there is growing awareness of the risks of algorithmic discrimination, it can be very difficult to establish that a specific algorithm is in fact discriminatory. This raises the question of how courts should allocate the burden of testing and proving whether a government algorithm is discriminatory, as between the government and the person affected. In R (Bridges) v South Wales Police [2020] EWCA Civ 1058, the England and Wales Court of Appeal found that public bodies must take positive steps to identify and address risks of algorithmic discrimination. This note explores the decision in Bridges and its implications for algorithmic decision-making in government. It suggests that Bridges, alongside recent decisions in Canada and the Netherlands, forms part of a broader trend: the courts are placing the burden of testing and reviewing potentially discriminatory algorithms on government, rather than the general public.

M3 - Article

VL - 20

SP - 352

EP - 360

JO - OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL

JF - OXFORD UNIVERSITY COMMONWEALTH LAW JOURNAL

SN - 1472-9342

IS - 2

ER -