ZORA (Zurich Open Repository and Archive)

Enforcing Group Fairness in Algorithmic Decision Making: Utility Maximization Under Sufficiency

Baumann, Joachim; Hannák, Anikó; Heitz, Christoph (2022). Enforcing Group Fairness in Algorithmic Decision Making: Utility Maximization Under Sufficiency. In: FAccT '22: 2022 ACM Conference on Fairness, Accountability, and Transparency, Seoul Republic of Korea, 21 June 2022 - 24 June 2022. ACM, 2315-2326.

Abstract

Binary decision-making classifiers are not fair by default. Fairness requirements add an element to the decision-making rationale, which is otherwise typically driven by maximizing some utility function. In that sense, algorithmic fairness can be formulated as a constrained optimization problem. This paper contributes to the discussion of how to implement fairness, focusing on the fairness concepts of positive predictive value (PPV) parity, false omission rate (FOR) parity, and sufficiency (which combines the former two).
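To make the two parity criteria concrete, here is a minimal sketch (not from the paper; function and variable names are illustrative) that computes per-group PPV and FOR for a binary decision rule. PPV parity holds when the first value is equal across groups, FOR parity when the second is, and sufficiency when both are:

```python
# Hypothetical sketch: per-group PPV and FOR for binary decisions.
# PPV = P(Y=1 | D=1): fraction of positive decisions that are correct.
# FOR = P(Y=1 | D=0): fraction of negative decisions that are mistaken.

def group_rates(y_true, y_pred, groups):
    """Return {group: (PPV, FOR)} for binary labels/decisions."""
    rates = {}
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        pos = [i for i in idx if y_pred[i] == 1]   # decided positively
        neg = [i for i in idx if y_pred[i] == 0]   # decided negatively
        ppv = sum(y_true[i] for i in pos) / len(pos) if pos else float("nan")
        f_or = sum(y_true[i] for i in neg) / len(neg) if neg else float("nan")
        rates[g] = (ppv, f_or)
    return rates
```

Checking equality of these rates across groups is the empirical counterpart of the constraints that the optimization problem imposes.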

We show that group-specific threshold rules are optimal for PPV parity and FOR parity, similar to well-known results for other group fairness criteria. However, depending on the underlying population distributions and the utility function, we find that sometimes an upper-bound threshold rule for one group is optimal: utility maximization under PPV parity (or FOR parity) might thus lead to selecting the individuals with the smallest utility for one group, instead of selecting the most promising individuals. This result is counter-intuitive and in contrast to the analogous solutions for statistical parity and equality of opportunity.
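The distinction between the usual lower-bound threshold rule and the counter-intuitive upper-bound case can be sketched as follows (a toy illustration, not the paper's algorithm; the threshold values are made up):

```python
# Hypothetical sketch of group-specific threshold rules.
# A "lower" rule accepts scores at or above a threshold (the most
# promising individuals); an "upper" rule accepts scores at or below
# a threshold, which under PPV/FOR parity can be the utility-maximizing
# choice for one group.

def decide(score, group, thresholds):
    kind, t = thresholds[group]            # ("lower", t) or ("upper", t)
    return score >= t if kind == "lower" else score <= t

thresholds = {
    "A": ("lower", 0.7),   # select the most promising individuals
    "B": ("upper", 0.4),   # upper-bound rule: select the least promising
}
```

Under such a rule, group B's accepted individuals are precisely those with the lowest scores, which is the counter-intuitive outcome the abstract describes.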

We also provide a solution for the optimal decision rules satisfying the fairness constraint sufficiency. We show that more complex decision rules are required and that this leads to within-group unfairness for all but one of the groups. We illustrate our findings based on simulated and real data.

Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections:
  • 03 Faculty of Economics > Department of Informatics
  • 08 Research Priority Programs > Digital Society Initiative
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Scopus Subject Areas:
  • Physical Sciences > Software
  • Physical Sciences > Human-Computer Interaction
  • Physical Sciences > Computer Vision and Pattern Recognition
  • Physical Sciences > Computer Networks and Communications
Scope: Discipline-based scholarship (basic research)
Language: English
Event End Date: 24 June 2022
Deposited On: 25 Oct 2022 11:32
Last Modified: 19 Mar 2025 04:44
Publisher: ACM
OA Status: Hybrid
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.1145/3531146.3534645
Other Identification Number: merlin-id:22863
Project Information:
  • Funder: SNSF
  • Grant ID: 407740_187473
  • Project Title: Socially acceptable AI and fairness trade-offs in predictive analytics
Download PDF: 'Enforcing Group Fairness in Algorithmic Decision Making: Utility Maximization Under Sufficiency'
  • Content: Published Version
  • Language: English
  • Publisher License
Download PDF: 'Enforcing Group Fairness in Algorithmic Decision Making: Utility Maximization Under Sufficiency'
  • Content: Accepted Version
  • Publisher License

Downloads

48 downloads since deposited on 25 Oct 2022
17 downloads since 12 months
