As a growing number of governments and companies use artificial intelligence (AI) systems to identify individuals, concerns are being raised about how these systems presume gender identity and sexual orientation.
AI systems mostly rely on visible traits such as appearance, mannerisms, and how “male” or “female” a person’s name may seem.
The default categories used by AI fall into a strict gender binary, and LGBTQ+ organisations are concerned this could enable discrimination.
Many are worried that real-time situations could be complicated by the presumptions of AI. For example, a traveller passing through an airport whose appearance does not align with the system’s preconceived notion of “male” or “female” could face additional scrutiny.