Algorithmic fairness through group parities? : the case of COMPAS-SAPMOC
dc.contributor.author | LAGIOIA, Francesca | |
dc.contributor.author | ROVATTI, Riccardo | |
dc.contributor.author | SARTOR, Giovanni | |
dc.date.accessioned | 2024-01-18T16:00:30Z | |
dc.date.available | 2024-01-18T16:00:30Z | |
dc.date.issued | 2023 | |
dc.description | Published online: 28 April 2022 | en |
dc.description.abstract | Machine learning classifiers are increasingly used to inform, or even make, decisions significantly affecting human lives. Fairness concerns have spawned a number of contributions aimed at both identifying and addressing unfairness in algorithmic decision-making. This paper critically discusses the adoption of group-parity criteria (e.g., demographic parity, equality of opportunity, treatment equality) as fairness standards. To this end, we evaluate the use of machine learning methods relative to different steps of the decision-making process: assigning a predictive score, linking a classification to the score, and adopting decisions based on the classification. Throughout our inquiry we use the COMPAS system, complemented by a radical simplification of it (our SAPMOC I and SAPMOC II models), as our running examples. Through these examples, we show how a system that is equally accurate for different groups may fail to comply with group-parity standards, owing to different base rates in the population. We discuss the general properties of the statistics determining the satisfaction of group-parity criteria and levels of accuracy. Using the distinction between scoring, classifying, and deciding, we argue that equalisation of classifications/decisions between groups can be achieved through group-dependent thresholding. We discuss contexts in which this approach may be meaningful and useful in pursuing policy objectives. We claim that the implementation of group-parity standards should be left to competent human decision-makers, under appropriate scrutiny, since it involves discretionary value-based political choices. Accordingly, predictive systems should be designed in such a way that relevant policy goals can be transparently implemented.
Our paper presents three main contributions: (1) it addresses a complex predictive system through the lens of simplified toy models; (2) it argues for selective policy interventions on the different steps of automated decision-making; (3) it points to the limited significance of statistical notions of fairness to achieve social goals. | en |
dc.description.sponsorship | Francesca Lagioia and Giovanni Sartor have been supported by the H2020 European Research Council (ERC) Project CompuLaw under the European Union’s Horizon 2020 research and innovation programme (Grant Agreement no. 833647) and by the European Union’s Justice Programme (2014-2020) Project ADELE: Analytics for DEcision of LEgal cases (Grant Agreement no. 101007420). Open access funding provided by Alma Mater Studiorum - Università di Bologna within the CRUI-CARE Agreement. | en |
dc.format.mimetype | application/pdf | en |
dc.identifier.citation | AI and society, 2023, Vol. 38, pp. 459-478 | en |
dc.identifier.doi | 10.1007/s00146-022-01441-y | |
dc.identifier.endpage | 478 | en |
dc.identifier.issn | 0951-5666 | |
dc.identifier.issn | 1435-5655 | |
dc.identifier.startpage | 459 | en |
dc.identifier.uri | https://hdl.handle.net/1814/76330 | |
dc.identifier.volume | 38 | en |
dc.language.iso | en | en |
dc.orcid.upload | true | * |
dc.publisher | Springer | en |
dc.relation | Computable Law | |
dc.relation.ispartof | AI and society | en |
dc.rights | info:eu-repo/semantics/openAccess | en |
dc.rights.license | Attribution 4.0 International | * |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | * |
dc.title | Algorithmic fairness through group parities? : the case of COMPAS-SAPMOC | en |
dc.type | Article | en |
dspace.entity.type | Publication | |
person.identifier.orcid | 0000-0001-7083-3487 | |
person.identifier.orcid | 0000-0003-2210-0398 | |
person.identifier.other | 39527 | |
person.identifier.other | 29040 | |
relation.isAuthorOfPublication | 11b153fb-d344-4c09-a461-c79ecf87e497 | |
relation.isAuthorOfPublication | e0511c58-5007-4b10-8d82-e19ca2df8cc2 | |
relation.isAuthorOfPublication.latestForDiscovery | 11b153fb-d344-4c09-a461-c79ecf87e497 | |
relation.isProjectOfPublication | bba7f606-01cf-4278-ae65-0157c5db3bf5 | |
relation.isProjectOfPublication.latestForDiscovery | bba7f606-01cf-4278-ae65-0157c5db3bf5 |
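The abstract argues that equal classification rates between groups can be obtained via group-dependent thresholding. As a rough illustration of that idea (not code from the paper; the function name, data, and the choice of demographic parity as the target criterion are assumptions for this sketch), one can pick a separate score cutoff per group so that each group's positive-classification rate matches a common target rate:

```python
# Hypothetical sketch of group-dependent thresholding: instead of one
# global cutoff on the predictive score, each group gets its own cutoff
# chosen so that its positive-classification rate equals target_rate.

def group_thresholds(scores_by_group, target_rate):
    """For each group, return the score cutoff that classifies
    (approximately) a target_rate fraction of that group as positive."""
    thresholds = {}
    for group, scores in scores_by_group.items():
        ordered = sorted(scores, reverse=True)
        # Classify the top k scorers as positive, with k set by target_rate.
        k = round(target_rate * len(ordered))
        thresholds[group] = ordered[k - 1] if k > 0 else float("inf")
    return thresholds

# Example: groups with different score distributions (e.g., due to
# different base rates) end up with different cutoffs but equal rates.
scores_by_group = {"A": [0.9, 0.8, 0.3, 0.2], "B": [0.6, 0.5, 0.4, 0.1]}
cutoffs = group_thresholds(scores_by_group, target_rate=0.5)
```

With these cutoffs, group A is thresholded at 0.8 and group B at 0.5, so both groups see the same share of positive classifications — the equalisation step the paper locates at the classifying stage rather than the scoring stage.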
Files

Original bundle
- Name: Algorithmic_fairness_2023.pdf
- Size: 2.16 MB
- Format: Adobe Portable Document Format
- Description: Full-text in Open Access, Published version

License bundle
- Name: license.txt
- Size: 3.83 KB
- Description: Item-specific license agreed upon to submission