Abstract
Detecting, measuring and mitigating unfairness are core aims of algorithmic fairness research. However, the most prominent approaches require access to individual-level demographic information, such as sex or race. In practice, such information is often inaccurate, incomplete or entirely absent. A wealth of techniques has been proposed in response, such as proxies, trusted third parties and cryptographic solutions. These vary greatly in their assumptions, aims and costs, making it challenging to ascertain which methods might be appropriate for a given setting. We aim to clarify the landscape by providing an overview of the different proposals, surfacing the key dimensions in which they differ, and briefly discussing associated benefits, limitations and ethical considerations. We hope this will be of benefit to researchers, practitioners and regulators attempting to navigate this space.
Citation information
Carolyn Ashurst and Adrian Weller. 2023. Fairness Without Demographic Data: A Survey of Approaches. In Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (EAAMO '23). Association for Computing Machinery, New York, NY, USA, Article 14, 1–12. https://doi.org/10.1145/3617694.3623234