Gender Perception

(Editor: Nancy Dess)

Introduction

Which facial features lead to classification of a face as male or female? The eye and brow region has been identified as particularly influential when viewing a face straight on (Campbell, Benson, Wallace, Doesbergh, & Coleman, 1999). Specifically, the vertical distance between the eye and the brow affects the gender classification of photographs of faces directly facing the camera.

Children and adults often report that they base their judgments of gender on features such as hair length or clothing. However, hair can be long or short on a man or a woman, and men and women often wear similar types of clothing. Despite the potential for confusion, people reliably categorize individuals even in the absence of information about hair or clothing. This study explores the role of facial features in cueing gender classification.

In this study, participants view photographs of faces (hair and neck covered) presented in three conditions: 1) full face - the entire face is visible, 2) eyes only - only the eye and brow region is visible, 3) mouth only - only the mouth and chin region is visible. Participants are asked to identify the gender of each stimulus and to indicate their confidence in their judgment.

Figures A, B, and C: Sample images from the gender perception experiment.

Design

There are two independent variables: facial cues, with three levels (full face, eyes only, mouth only), and face gender, with two levels (male faces, female faces). This 3 x 2 within-subject factorial design produces six conditions: 8 male and 8 female models each appear in the full face, eyes only, and mouth only views.
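
As an illustration only, the six cells of this factorial design can be enumerated by crossing the two factors; the short Python sketch below uses hypothetical condition labels that are not part of the experiment materials.

```python
from itertools import product

# Hypothetical labels for the two within-subject factors.
facial_cues = ["full face", "eyes only", "mouth only"]  # 3 levels
face_gender = ["male", "female"]                        # 2 levels

# Crossing the factors yields the 3 x 2 = 6 conditions;
# each of the 8 male and 8 female models appears in all three cue views.
for gender, cue in product(face_gender, facial_cues):
    print(f"{gender} faces, {cue} view")
```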

Participants identify whether each face is male or female and rate their confidence in that judgment. Thus, the dependent variables are accuracy in identifying the gender of the faces and confidence in those identifications.

Data Analyses

The data file includes the following variables for each participant, in this order: UserID; ClassID; participant gender, age, and hand preference; date of data collection; number of correctly classified faces followed by associated confidence rating for photos of female "eyes," female "mouth," female "full face," male "eyes," male "mouth," and male "full face." An excerpt of the data is shown in Figure 1.

Figure 1: Sample data from the gender perception experiment.
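
A minimal analysis sketch is shown below. It assumes the data file has been exported as a headerless comma-separated file with one row per participant, and it uses hypothetical column names that simply follow the variable order described above; the actual file format and headers may differ.

```python
import pandas as pd

# Hypothetical column names following the variable order described above:
# ID fields, demographics, date, then a correct-classification count and
# confidence rating for each of the six conditions.
columns = [
    "UserID", "ClassID", "Gender", "Age", "HandPref", "Date",
    "F_Eyes_Correct", "F_Eyes_Conf",
    "F_Mouth_Correct", "F_Mouth_Conf",
    "F_Full_Correct", "F_Full_Conf",
    "M_Eyes_Correct", "M_Eyes_Conf",
    "M_Mouth_Correct", "M_Mouth_Conf",
    "M_Full_Correct", "M_Full_Conf",
]

# Assumes a headerless CSV export; adjust if the file already has headers.
data = pd.read_csv("gender_perception.csv", header=None, names=columns)

# Mean number of correctly classified faces per condition
# (each condition contains 8 faces under the design described above).
accuracy_cols = [c for c in columns if c.endswith("_Correct")]
print(data[accuracy_cols].mean())

# Mean confidence rating per condition.
confidence_cols = [c for c in columns if c.endswith("_Conf")]
print(data[confidence_cols].mean())
```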

Applications/Extensions

In this experiment, gender is treated as a dichotomous (either/or) variable: face gender (IV) is defined as either male or female, and face perception (DV) is measured as 'male' or 'female.' In many cultures, species, and situations, gender often appears to function this way. For instance, when asked to check a box "M" for male or "F" for female on an application form, most people answer effortlessly. However, gender can be experienced, conceptualized, and operationally defined as consisting of more than two categories (e.g., male, female, intersexed) or as varying along one or more continuous dimensions (e.g., masculinity, femininity, androgyny). Because gender powerfully shapes people's feelings about themselves and others, as well as their roles in society, careful consideration of the possible meanings of gender and the implications for its scientific study is warranted.

An additional issue relevant here concerns how faces differ in ways other than gender. Attractiveness, age, ethnicity, and other variables influence face perception and could influence how easily or confidently faces are categorized as male or female. Thus, controlling for such variables may affect the outcome of studies of gender perception.

References


Campbell, R., Benson, P. F., Wallace, S. B., Doesbergh, S., & Coleman, M. (1999). 
	More about brows: How poses that change brow position affect 
	perceptions of gender. Perception, 28, 489-504.
	
Carey, S., & Diamond, R. (1977). From piecemeal to configural representation 
	of faces. Science, 195, 312-314.