INTRODUCTION AND BACKGROUND

Machine learning (ML) methods are now commonly used to make automated predictions about human beings: their lives and their characteristics. Vast amounts of individual data are aggregated to make predictions about people's shopping preferences, health status, or likelihood of recommitting a crime. Computer vision, an ML task for training a computer to metaphorically 'see' specific objects, is a pertinent domain for examining the interaction between ML and human identity. Facial analysis (FA), a subset of computer vision encompassing tasks like facial classification and facial recognition, is trained to read visual data and make classifications about innate human identities, such as age (Lin et al., 2006), gender (Khan et al., 2013), ethnicity (Lu & Jain, 2004), and even sexual orientation (Wang & Kosinski, 2017). Often, decisions about identity characteristics are made without explicit user input, or even user knowledge. Users effectively become 'targets' of the system, with no ability to contest these classifications. Surrounding these identity classifications are concerns about bias (e.g., Buolamwini & Gebru, 2018), representation (e.g., Hamidi et al., 2018; Keyes, 2018), and the embrace of pseudoscientific practices like physiognomy (e.g., Agüera y Arcas et al., 2017).

In this short paper, I present several considerations for contestability in computer vision. By contestability, I refer to the agency an individual has to contest the inputs and outputs of a computer vision system, including how one's data is collected, defined, and used. I ground these considerations in one identity trait: gender. Gender is a salient characteristic to consider given that criticisms of computer vision have stemmed from concerns about both sexism and cissexism, that is, discrimination against transgender and nonbinary communities (Hibbs, 2014). Gender in computer vision has largely been presented as binary (i.e., male vs. female) and has been exclusive of genders beyond the cisgender norm (e.g., in automatic gender recognition (AGR) systems that classify gender explicitly [Hamidi et al., 2018;