Diversity & AI

Diversity & AI investigates, from an intersectional approach, how robots, AI, and algorithmic decision-making processes impact different communities, including the LGBTQIA+ community.

Poulsen, A., Fosch-Villaronga, E., & Søraa, R. A. (2020) Queering machines. Nature Machine Intelligence, 2(3), 152.

Gender inferences in social media

Fosch-Villaronga, E., Poulsen, A., Søraa, R. A., & Custers, B. H. M. (2021) A little bird told me your gender: Gender inferences in social media. Information Processing & Management, 58(3), 102541.

Poulsen, A., Fosch-Villaronga, E., & Burmeister, O. K. (2020) Cybersecurity, value sensing robots for LGBTIQ+ elderly, and the need for revised codes of conduct. Australasian Journal of Information Systems, 24, 1-16.

Until now, each profession has developed its professional code of conduct independently. However, the use of robots and artificial intelligence is blurring professional delineations: aged care nurses work with lifting robots, tablet computers, and intelligent diagnostic systems, and health information system designers work with clinical teams. While robots help medical staff extend the professional service they provide, it is not clear how professions adhere and adapt to this new reality. In this article, we reflect on how the introduction of robots may shape codes of conduct, in particular with regard to cybersecurity. We do so by focusing on the use of social robots to help LGBTIQ+ elderly cope with loneliness and depression. Using robots in such a delicate domain of application changes how care is delivered: alongside the caregiver, there is now a cyber-physical health information system that can learn from experience and act autonomously. Our contribution stresses the importance of including cybersecurity considerations in codes of conduct for both robot developers and caregivers, as it is the human, not the machine, who is responsible for ensuring the system’s security and the user’s safety.

Fosch-Villaronga, E., Poulsen, A., Søraa, R. A., & Custers, B. H. M. (2020) Don’t guess my gender, gurl: The inadvertent impact of gender inferences. BIAS 2020: Bias and Fairness in AI Workshop at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 14-18 September 2020, online.

Social media platforms employ inferential analytics methods to guess user preferences, and these inferences may include sensitive attributes such as race, gender, sexual orientation, and political opinions. These methods are often opaque, but they can have significant effects: predicting behaviors for marketing purposes, influencing behavior for profit, serving attention economics, and reinforcing existing biases such as gender stereotyping. Although two international human rights treaties include express obligations relating to harmful and wrongful stereotyping, these stereotypes persist both online and offline, and platforms often appear to fail to understand that gender is not merely a binary of being a 'man' or a 'woman' but is socially constructed. Our study investigates the impact of algorithmic bias on inadvertent privacy violations and on the reinforcement of social prejudices of gender and sexuality, through a multidisciplinary perspective including legal, computer science, and queer media viewpoints. We conducted an online survey to understand whether and how Twitter inferred the gender of users. Beyond Twitter's binary understanding of gender and the inevitability of gender inference as part of Twitter's personalization trade-off, the results show that Twitter misgendered users in nearly 20% of cases (N=109). Although not apparently correlated, only 8% of straight male respondents were misgendered, compared to 25% of gay men and 16% of straight women. Our contribution shows how the lack of attention to gender in gender classifiers exacerbates existing biases and affects marginalized communities. With our paper, we hope to promote accountability for privacy, diversity, and inclusion online and advocate for the freedom of identity that everyone should have online and offline.
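Read as a fairness audit, these figures are per-group error rates: the share of respondents in each group whose inferred gender did not match their self-identified gender. Below is a minimal Python sketch of how such rates could be tabulated from survey responses; the group labels and records are hypothetical illustrations, not the study's data.

# Hypothetical sketch: per-group misgendering rates from survey responses.
# Group labels and records are invented for illustration; the actual study
# surveyed 109 Twitter users.
from collections import defaultdict

# Each record: (self-identified group, was the platform's inference wrong?)
responses = [
    ("straight man", False), ("straight man", True),
    ("gay man", True), ("gay man", False),
    ("straight woman", False), ("straight woman", True),
    # ... one entry per respondent
]

tallies = defaultdict(lambda: [0, 0])  # group -> [misgendered, total]
for group, misgendered in responses:
    tallies[group][1] += 1
    tallies[group][0] += misgendered  # True counts as 1

for group, (wrong, total) in sorted(tallies.items()):
    print(f"{group}: {wrong}/{total} misgendered ({wrong / total:.0%})")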

Søraa, R. A., & Fosch-Villaronga, E. (2020) Exoskeletons for all: The interplay between exoskeletons, inclusion, gender, and intersectionality. Paladyn, Journal of Behavioral Robotics, 11(1), 217-227.

In this article, we investigate the relation between gender and exoskeleton development through the lens of intersectionality theory. Exoskeleton users come in a wide variety of shapes, sizes, and genders. However, wearable robot engineers often do not develop such devices on the premise that the product should fit as many end users as possible. Instead, designers tend to use a one-size-fits-all approach, a design choice that seems legitimate from a return-on-investment viewpoint but that may not do justice to end users. Intended users of exoskeletons must meet a series of user criteria, including height, weight, and, in the case of rehabilitation, health condition. The more rigid the inclusion criteria for who the intended user of the technology can be, the more the exclusion criteria grow in parallel. The implications and deep-rootedness of gender and diversity considerations in practices and structural systems have been largely disregarded. Mechanical and robot technology was historically seen as part of a distinctly male sphere, and the criteria used today to develop new technology may reflect biases from another time that should no longer be valid. To make this technology available for all, we suggest some tools for designers and manufacturers to help them think beyond their target market and be more inclusive.