Face Image Databases

An A-Z directory of databases of face stimulus images for use in behavioral research

Introduction

The following is a directory of databases containing face stimulus sets available for use in behavioral studies. Please read the rights, permissions, and licensing information on each database's webpage before proceeding. Make sure to obtain the required permissions and credit/cite as requested by the creators.

Last updated: 20 JAN 2022

10k US Adult Faces Database

This database contains 10,168 natural face photographs and several measures for 2,222 of the faces, including memorability scores, computer vision and psychology attributes, and landmark point annotations. The face photographs are JPEGs with 72 pixels/in resolution and 256-pixel height. 

Citation: Bainbridge, W.A., Isola, P., & Oliva, A. (2013). The intrinsic memorability of face images. Journal of Experimental Psychology: General, 142(4), 1323-1334.

Contact: brainbridgelab@gmail.com

American Multiracial Face Database

The American Multiracial Face Database contains 110 faces of individuals with mixed-race heritage (smiling and neutral expression poses), along with ratings of those faces by naive observers; the database is freely available to academic researchers. The faces were rated on attractiveness, emotional expression, racial ambiguity, masculinity, racial group membership(s), gender group membership(s), warmth, competence, dominance, and trustworthiness.

Citation: Chen, J.M., Norman, J.B. & Nam, Y. Broadening the stimulus set: Introducing the American Multiracial Faces Database. Behav Res (2020). https://doi.org/10.3758/s13428-020-01447-8

Amsterdam Dynamic Facial Expression Set (ADFES)

The ADFES is a rich stimulus set comprising 648 filmed emotional expressions. The set features displays of nine emotions: the six ‘basic’ emotions (anger, disgust, fear, joy, sadness, and surprise), as well as contempt, pride and embarrassment expressed by 22 Northern-European and Mediterranean models (10 female, 12 male).

Citation: Van der Schalk, J., Hawk, S. T., Fischer, A. H., & Doosje, B. J. (in press). Moving faces, looking places: The Amsterdam Dynamic Facial Expressions Set (ADFES). Emotion.

Contact: Agneta Fischer, a.h.fischer@uva.nl

AT&T Database of Faces (formerly ORL Database of Faces)

The Database of Faces (formerly 'The ORL Database of Faces') contains a set of face images taken between April 1992 and April 1994 at AT&T Laboratories Cambridge. There are ten different images of each of 40 distinct subjects. For some subjects, the images were taken at different times, varying the lighting, facial expressions (open / closed eyes, smiling / not smiling), and facial details (glasses / no glasses). All the images were taken against a dark homogeneous background with the subjects in an upright, frontal position (with tolerance for some side movement).

Citation: Samaria, F. S. (1994). Face recognition using hidden Markov models (Doctoral dissertation, University of Cambridge).

Contact: AT&T Laboratories Cambridge

Basel Face Database (BFD)

The Basel Face Database (BFD) is built upon portrait photographs of forty different individuals. All photographs have been manipulated to appear more or less agentic and communal (Big Two personality dimensions) as well as open to experience, conscientious, extraverted, agreeable, and neurotic (Big Five personality dimensions). Thus, the database consists of forty photographs of different individuals and 14 variations of each, signaling different personalities. The database therefore allows researchers to investigate the impact of personality on different outcome variables in a very systematic way.

Citation: Walker, M., Schönborn, S., Greifeneder, R., & Vetter, T. (2018). The Basel Face Database: A validated set of photographs reflecting systematic differences in Big Two and Big Five personality dimensions. PloS one, 13(3). doi: https://doi.org/10.1371/journal.pone.0193190

Contact: Mirella Walker

Bogazici Face Database

The Bogazici Face Database is a database of Turkish undergraduate student targets. High-resolution standardized photographs were taken and are supported by the following materials: (a) basic demographic and appearance-related information, (b) two types of landmark configurations (for Webmorph and geometric morphometrics (GM)), (c) facial width-to-height ratio (fWHR) measurements, (d) information on photography parameters, and (e) perceptual norms provided by raters.

Citation: Saribay SA, Biten AF, Meral EO, Aldan P, Třebický V, Kleisner K (2018) The Bogazici face database: Standardized photographs of Turkish faces with supporting materials. PLoS ONE 13(2): e0192018. https://doi.org/10.1371/journal.pone.0192018
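
For illustration, fWHR is conventionally computed as bizygomatic width divided by upper-face height (brow to upper lip). Below is a minimal Python sketch of that calculation from landmark coordinates; the landmark names and pixel values are hypothetical, not taken from the database's actual landmark files:

```python
import math

# Hypothetical landmark coordinates (x, y) in pixels; a real analysis would
# read these from the database's Webmorph or GM landmark configurations.
landmarks = {
    "left_zygion":  (412.0, 610.0),   # most lateral point of left cheekbone
    "right_zygion": (892.0, 615.0),   # most lateral point of right cheekbone
    "mid_brow":     (648.0, 560.0),   # midpoint between the brows
    "upper_lip":    (650.0, 880.0),   # top of the upper lip
}

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# fWHR = bizygomatic width / upper-face height
fwhr = (dist(landmarks["left_zygion"], landmarks["right_zygion"])
        / dist(landmarks["mid_brow"], landmarks["upper_lip"]))
print(f"fWHR = {fwhr:.2f}")  # ~1.50 for these made-up points
```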

CalTech 10k Web Faces

The dataset contains images of people collected from the web by typing common given names into Google Image Search. The coordinates of the eyes, the nose, and the center of the mouth for each frontal face are provided in a ground truth file. This information can be used to align and crop the human faces or as a ground truth for a face detection algorithm. The dataset has 10,524 human faces of various resolutions and in different settings, e.g., portrait images, groups of people, etc. Profile faces or very low-resolution faces are not labeled.

Citation: Angelova, A., Abu-Mostafa, Y., & Perona, P. (2005). Pruning training sets for learning of object categories. In Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR).

Contact: Anelia Angelova, anelia@caltech.edu
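
As a sketch of how the ground truth coordinates might be used for alignment (the calling convention and file name below are assumptions, not the dataset's documented format), one could rotate each image so the eyes lie on a horizontal line and crop around their midpoint:

```python
import math
from PIL import Image

def align_face(img, left_eye, right_eye, out_size=256):
    """Rotate so the inter-eye line is horizontal, then crop a square
    centered between the eyes. Eye positions are (x, y) pixel coordinates,
    e.g., taken from the dataset's ground truth file."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = math.degrees(math.atan2(dy, dx))   # tilt of the inter-eye line
    cx = (left_eye[0] + right_eye[0]) / 2.0    # midpoint between the eyes
    cy = (left_eye[1] + right_eye[1]) / 2.0
    rotated = img.rotate(angle, center=(cx, cy), resample=Image.BILINEAR)
    half = out_size // 2
    return rotated.crop((int(cx - half), int(cy - half),
                         int(cx + half), int(cy + half)))

# Hypothetical usage with made-up coordinates and file name:
# face = align_face(Image.open("web_face_0001.jpg"), (120, 160), (190, 168))
```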

Chicago Face Database

The Chicago Face Database was developed at the University of Chicago by Debbie S. Ma, Joshua Correll, and Bernd Wittenbrink. The CFD is intended for use in scientific research. It provides high-resolution, standardized photographs of male and female faces of varying ethnicity, between the ages of 17 and 65. Extensive norming data are available for each individual model. These data include both physical attributes (e.g., face size) and subjective ratings by independent judges (e.g., attractiveness). The database consists of a main image set and several extension sets.

CFD

  • The main CFD set consists of images of 597 unique individuals. They include self-identified Asian, Black, Latino, and White female and male models, recruited in the United States. All models are represented with neutral facial expressions. A subset of the models is also available with happy (open mouth), happy (closed mouth), angry, and fearful expressions. 
  • Citation: Ma, D. S., Correll, J., & Wittenbrink, B. (2015). The Chicago face database: A free stimulus set of faces and norming data. Behavior research methods, 47(4), 1122-1135.

CFD-MR

  • The CFD-MR extension set includes images of 88 unique individuals, who self-reported multiracial ancestry. All models were recruited in the United States. The images depict models with neutral facial expressions. Additional facial expression images with happy (open mouth), happy (closed mouth), angry, and fearful expressions are in production and will become available with a future update of the database.
  • Citation: Ma, Kantner, & Wittenbrink (2020). Chicago Face Database: Multiracial Expansion. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01482-5.

CFD-INDIA

  • The CFD-INDIA extension set includes images of 142 unique individuals, recruited in Delhi, India. The images depict models with neutral facial expressions. Additional facial expression images with happy (open mouth), happy (closed mouth), angry, and fearful expressions are in production and will become available with a future update of the database.
  • Citation: Lakshmi, Wittenbrink, Correll, & Ma (2021). The India Face Set: International and Cultural Boundaries Impact Face Impressions and Perceptions of Category Membership. Frontiers in Psychology, 12, 161. https://doi.org/10.3389/fpsyg.2021.627678.

Contact: Bernd Wittenbrink, bernd.wittenbrink@chicagobooth.edu

Child Affective Facial Expression Set (CAFE)

The Child Affective Facial Expressions Set (CAFE) is the first large and representative set of children posing a variety of affective facial expressions that can be used for scientific research. The set is made up of nearly 1,200 photographs of over 100 children (ages 2-8) making 7 different facial expressions: happy, angry, sad, fearful, surprised, neutral, and disgusted.

Citation: LoBue, V. & Thrasher, C. (2015). The Child Affective Facial Expression (CAFE) Set: Validity and reliability from untrained adults. Frontiers in Emotion Science, 5. 

 

Children Spontaneous Facial Expression Video Database (LIRIS-CSE)

A novel emotional database that contains movie clips / dynamic images of 12 ethnically diverse children. This unique database contains spontaneous / natural facial expressions of children in diverse settings and recording scenarios, showing six universal or prototypic emotional expressions (happiness, sadness, anger, surprise, disgust, and fear). Children were recorded in a constraint-free environment (no restriction on head or hand movement, free sitting, no restriction of any sort) while they watched specially built / selected stimuli. This constraint-free environment allowed the authors to record the spontaneous / natural expressions of children as they occurred.

Citation: Khan, R. A., Crenn, A., Meyer, A., & Bouakaz, S. (2019). A novel database of children's spontaneous facial expressions (LIRIS-CSE). Image and Vision Computing, 83–84. arXiv preprint (2018): arXiv:1812.01555.

Contact: Request Form

City Infant Faces Database

This database contains 60 photographs of positive infant faces, 54 photographs of negative infant faces, and 40 photographs of neutral infant faces. The images have high criterion validity and good test–retest reliability. 

Citation: Webb, R., Ayers, S. & Endress, A. The City Infant Faces Database: A validated set of infant facial expressions. Behav Res 50, 151–159 (2018). https://doi.org/10.3758/s13428-017-0859-9

Contact: Rebecca Webb

CMU Multi-PIE Face Database

The CMU Multi-PIE face database contains more than 750,000 images of 337 people recorded in up to four sessions over the span of five months. Subjects were imaged under 15 viewpoints and 19 illumination conditions while displaying a range of facial expressions.

Citation: Sim, T., Baker, S., & Bsat, M. (2001). The CMU pose, illumination and expression database of human faces. Carnegie Mellon University Technical Report CMU-RI-TR-01-02.

Contact: Ralph Gross, ralph@multipie.org

Cohn-Kanade AU-Coded Facial Expression Database

The Cohn-Kanade AU-Coded Facial Expression Database affords a test bed for research in automatic facial image analysis and is available for use by the research community. Image data consist of approximately 500 image sequences from 100 subjects. Accompanying metadata include annotation of FACS action units and emotion-specified expressions. Subjects range in age from 18 to 30 years. Sixty-five percent were female; 15 percent were African American, and 3 percent were Asian or Latino.

Subjects were instructed by an experimenter to perform a series of 23 facial displays that included single action units (e.g., AU 12, or lip corners pulled obliquely) and action unit combinations (e.g., AU 1+2, or inner and outer brows raised).  Each begins from a neutral or nearly neutral face.  For each, an experimenter described and modeled the target display.  Six were based on descriptions of prototypic emotions (i.e., joy, surprise, anger, fear, disgust, and sadness). 

Citation: Kanade, T., Cohn, J. F., & Tian, Y. (2000, March). Comprehensive database for facial expression analysis. In Proceedings Fourth IEEE International Conference on Automatic Face and Gesture Recognition (Cat. No. PR00580) (pp. 46-53). IEEE.

Contact: Takeo Kanade, kanade@andrew.cmu.edu

Complex Emotion Expression Database (CEED)

The Complex Emotion Expression Database (CEED) is a digital stimulus set of 243 basic and 237 complex emotional facial expressions. The stimuli represent six basic expressions (angry, disgusted, fearful, happy, sad, and surprised) and nine complex expressions (affectionate, attracted, betrayed, brokenhearted, contemptuous, desirous, flirtatious, jealous, and lovesick) posed by formally trained Black and White young adult actors.

Citation: Benda MS, Scherf KS (2020) The Complex Emotion Expression Database: A validated stimulus set of trained actors. PLoS ONE 15(2): e0228248. https://doi.org/10.1371/journal.pone.0228248

CVL Database

The Computer Vision Laboratory (CVL) Face Database contains photographs of 114 persons, approximately 18 years of age, with 7 images per person.

Citation: Mirage 2003, Conference on Computer Vision / Computer Graphics Collaboration for Model-based Imaging, Rendering, Image Analysis and Graphical Special Effects, INRIA Rocquencourt, France, March 10-11, 2003 (pp. 38-47).

Contact: Peter Peer, peter.peer@fri.uni-lj.si

Dartmouth Database of Children's Faces

The Dartmouth Database of Children's Faces contains images of 40 male and 40 female models between the ages of 6 and 16. Models are photographed on a black background and are wearing black bibs and black hats to cover hair and ears. They are photographed from 5 different camera angles and pose 8 different facial expressions. Models were rated by independent raters and are ranked for the overall believability of their poses.

Citation: Dalrymple, K. A., Gomez, J., & Duchaine, B. (2013). The Dartmouth Database of Children’s Faces: Acquisition and validation of a new face stimulus set. PloS one, 8(11), e79131.

Contact: Kristen Dalrymple, kad@umn.edu

Face Database

The Face Database consists of 575 individual faces ranging from ages 18 to 93. Our database was developed to be more representative of age groups across the lifespan, with a special emphasis on recruiting older adults. The resulting database has faces of 218 adults age 18-29, 76 adults age 30-49, 123 adults age 50-69, and 158 adults age 70 and older.

Citation: Minear, M., & Park, D. C. (2004). A lifespan database of adult facial stimuli. Behavior research methods, instruments, & computers : a journal of the Psychonomic Society, Inc, 36(4), 630–633. https://doi.org/10.3758/bf03206543

Face Image Meta-Database (fIMDb)

Until now, an index of face databases, their features, and how to access them has been unavailable. The "Face Image Meta-Database" (fIMDb) provides researchers with the tools to find the face images best suited to their research. The fIMDb is available from: https://cliffordworkman.com/resources/

Citation: Workman, C. I., & Chatterjee, A. (2021). The Face Image Meta-Database (fIMDb) & ChatLab Facial Anomaly Database (CFAD): Tools for research on face perception and social stigma. Methods in Psychology, 5, 100063. https://doi.org/10.1016/j.metip.2021.100063

Face Place(s)

This dataset includes multiple photographs for over 200 individuals of many different races, with consistent lighting, multiple views, real emotions, and disguises (some participants returned for a second session several weeks later with a haircut, or a new beard, etc.). The images are in JPEG format, 250 x 250 pixels, 72 dpi, 24-bit color.

Citation: Righi, G, Peissig, JJ, & Tarr, MJ (2012) Recognizing disguised faces. Visual Cognition, 20(2), 143-169. doi:10.1080/13506285.2012.654624

Contact: Tarr Lab, Carnegie Mellon University, tarrlab@gmail.com

 

Face Recognition Technology (FERET)

FERET

  • The FERET database was collected in 15 sessions between August 1993 and July 1996. The database contains 1564 sets of images for a total of 14,126 images that includes 1199 individuals and 365 duplicate sets of images. A duplicate set is a second set of images of a person already in the database and was usually taken on a different day.
  • Citation: Phillips, P. J., Martin, A., Wilson, C. L., & Przybocki, M. (2000). An introduction to evaluating biometric systems. Computer, 33(2), 56-63.

Color FERET

  • As part of the FERET program, a database of facial imagery was collected between December 1993 and August 1996. The database is used to develop, test, and evaluate face recognition algorithms.

Contact: P. Jonathon Phillips, jonathon.phillips@nist.gov

Face Research Lab - London Set

The London Set contains images of 102 adult faces, 1350 x 1350 pixels, in full color.

Citation: DeBruine, Lisa; Jones, Benedict (2017): Face Research Lab London Set. figshare. Dataset. https://doi.org/10.6084/m9.figshare.5047666.v5   

Face Research Toolkit (FaReT)

Face Research Toolkit: A free and open-source toolkit of three-dimensional models and software to study face perception. Contains 8 manipulatable facial expression models.

Citation: Hays, J. S., Wong, C., & Soto, F. (2020). FaReT: A free and open-source toolkit of three-dimensional models and software to study face perception. Behavior Research Methods, 52(6), 2604-2622.

Contact: Fabian Soto, Florida International University

FACES

FACES

  • FACES is a set of images of naturalistic faces of 171 young (n = 58), middle-aged (n = 56), and older (n = 57) women and men displaying each of six facial expressions: neutrality, sadness, disgust, fear, anger, and happiness. The database comprises two sets of pictures per person and per facial expression (a vs. b set), resulting in a total of 2,052 images.
  • Citation: Ebner, N., Riediger, M., & Lindenberger, U. (2010). FACES—A database of facial expressions in young, middle-aged, and older women and men: Development and validation. Behavior Research Methods, 42, 351-362. doi:10.3758/BRM.42.1.351.

Dynamic FACES

  • Dynamic FACES is an extension of the original FACES database. It is a database of morphed videos (n = 1,026) of young, middle-aged, and older adults displaying six naturalistic emotional facial expressions: neutrality, sadness, disgust, fear, anger, and happiness. Static images used for morphing came from the original FACES database. Videos were created by transitioning from a static neutral image to a target emotion. Videos are available in 384 x 480 pixels as .mp4 files or in the original size of 1280 x 1600 as .mov files.
  • Citation: Holland, C. A. C., Ebner, N. C., Lin, T., & Samanez-Larkin, G. R. (2019). Emotion identification across adulthood using the Dynamic FACES database of emotional expressions in younger, middle aged, and older adults. Cognition and Emotion, 33, 245-257. doi:10.1080/02699931.2018.1445981.

Scrambled FACES

  • All 2,052 images from the original FACES database were scrambled using MATLAB. With the randblock function, original FACES files were treated as 800x1000x3 matrices – the third dimension denoting specific RGB values – and partitioned into non-overlapping 2x2x3 blocks. The matrices were then randomly shuffled by these smaller blocks, providing final images that matched the dimensions of the original image and were composed of the same individual pixels, although arranged differently. All scrambled images are 800x1000 jpeg files (96 dpi).
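
A rough NumPy equivalent of this block-scrambling procedure is sketched below. This is an illustration only; the original authors used MATLAB's randblock, and only the 2x2x3 block size and 800x1000x3 image shape are taken from the description above:

```python
import numpy as np

def scramble_blocks(img, bh=2, bw=2, seed=None):
    """Randomly shuffle the non-overlapping bh x bw x channel blocks of an
    image array, preserving its overall dimensions and pixel values."""
    h, w, c = img.shape
    assert h % bh == 0 and w % bw == 0, "dims must be divisible by block size"
    # Partition into an (h//bh, w//bw) grid of bh x bw x c blocks
    grid = img.reshape(h // bh, bh, w // bw, bw, c).swapaxes(1, 2)
    blocks = grid.reshape(-1, bh, bw, c)
    # Shuffle the blocks, then reassemble into the original shape
    blocks = blocks[np.random.default_rng(seed).permutation(len(blocks))]
    grid = blocks.reshape(h // bh, w // bw, bh, bw, c).swapaxes(1, 2)
    return grid.reshape(h, w, c)

# scrambled = scramble_blocks(image_array, seed=0)  # image_array: 800x1000x3
```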

FaceScrub

A dataset with a total of 106,863 face images of 530 male and female celebrities, with about 200 images per person. As such, it is one of the largest public face databases.

Citation: H.-W. Ng, S. Winkler. A data-driven approach to cleaning large face datasets. Proc. IEEE International Conference on Image Processing (ICIP), Paris, France, Oct. 27-30, 2014.

Contact: Request Form

FAMED Face Database (Video)

The Faces and Motion Exeter Database (FAMED) is a video database of 32 male actors for use in psychological research.  Each actor was filmed from two viewpoints (full-face and three-quarter) whilst they performed a series of facial motions including the telling of three jokes, a short conversation, six facial expressions (smiling, anger, fear, disgust, surprise and sadness) and rigid motion such as head rotation from left to right and up and down.  The actors performed all actions three times; once with no headgear, once wearing a swimming cap to hide hair cues and once whilst wearing a wig.

Citation: Longmore, C. A., & Tree, J. J. (2013). Motion as a cue to face recognition: Evidence from congenital prosopagnosia. Neuropsychologia, 51, 864-875

Contact: Chris Longmore, chris.longmore@plymouth.ac.uk

FEI Face Database

The FEI Face Database is a Brazilian face database that contains a set of face images taken between June 2005 and March 2006 at the Artificial Intelligence Laboratory of FEI in São Bernardo do Campo, São Paulo, Brazil. There are 14 images for each of 200 individuals, a total of 2,800 images. All images are in colour and taken against a white homogeneous background in an upright frontal position, with profile rotation of up to about 180 degrees. Scale might vary by about 10%, and the original size of each image is 640 x 480 pixels. The faces are mainly of students and staff at FEI, between 19 and 40 years old, with distinct appearance, hairstyles, and adornments. The numbers of male and female subjects are exactly equal, at 100 each.

Contact: Carlos Eduardo Thomaz, cet@fei.edu.br

FG-NET Database with Facial Expressions and Emotions

An image database containing face images showing a number of subjects performing the six different basic emotions defined by Ekman & Friesen. The database has been developed in an attempt to assist researchers who investigate the effects of different facial expressions.

Citation: Wallhoff, F., Schuller, B., Hawellek, M., & Rigoll, G. (2006). Efficient recognition of authentic dynamic facial expressions on the Feedtum database. In IEEE ICME (pp. 493-496). IEEE Computer Society.

Contact: Frank Wallhoff, frank.wallhoff@jade-hs.de

Glasgow Unfamiliar Face Database (GUFD)

This database contains three images of 303 identities (each taken using a separate camera), similarity data quantifying the perceived similarity between any two identities, and 20 images per identity extracted from a video clip for the purpose of familiarisation.

Contact: Mike Burton, mike.burton@york.ac.uk

Japanese and Caucasian Faces (Emotion and Neutral)

JACFEE: Japanese and Caucasian Facial Expressions of Emotion

  • Consists of 56 color photographs of 56 different individuals who each illustrate one of the seven basic facial expressions of emotion.
  • Fee: $95

JACNeuf: Japanese and Caucasian Neutral Faces

  • Consists of 56 color photographs of the subjects found in the JACFEE collection showing neutral facial expressions.
  • Fee: $95

Japanese Female Facial Expression (JAFFE) Dataset

The Japanese Female Facial Expression (JAFFE) Dataset contains 213 images of 10 Japanese female expressers. 

Citation: Lyons, Michael, Kamachi, Miyuki, & Gyoba, Jiro. (1998). The Japanese Female Facial Expression (JAFFE) Dataset [Data set]. Zenodo. https://doi.org/10.5281/zenodo.3451524

Contact: Michael Lyons, ORCID

Karolinska Directed Emotional Faces (KDEF)

The Karolinska Directed Emotional Faces (KDEF) is a set of 4,900 pictures of human facial expressions of emotion. The set contains 70 individuals, each displaying 7 different emotional expressions, with each expression photographed twice from 5 different angles (70 x 7 x 5 x 2 = 4,900).

Citation: Goeleven, E., De Raedt, R., Leyman, L., & Verschuere, B. (2008). The Karolinska directed emotional faces: a validation study. Cognition and emotion, 22(6), 1094-1118.

Contact: Emotion Lab at Karolinska Institutet

Labeled Faces in the Wild

Labeled Faces in the Wild is a database of face photographs designed for studying the problem of unconstrained face recognition. The data set contains more than 13,000 images of faces collected from the web.

Citation: Gary B. Huang, Manu Ramesh, Tamara Berg, and Erik Learned-Miller. Labeled Faces in the Wild: A Database for Studying Face Recognition in Unconstrained Environments. University of Massachusetts, Amherst, Technical Report 07-49, October, 2007.

Contact: Gary Huang, gbhuang@cs.umass.edu

Libor Spacek's Facial Images Databases

This database contains

  • Total number of individuals: 395
  • Number of images per individual: 20
  • Total number of images: 7900
  • Gender:  contains images of male and female subjects
  • Race:  contains images of people of various racial origins
  • Age Range:  the images are mainly of first-year undergraduate students, so the majority of individuals are between 18-20 years old, but some older individuals are also present.
  • Glasses: Yes
  • Beards: Yes
  • Image format: 24bit colour JPEG
  • Camera used: S-VHS camcorder
  • Lighting: artificial, mixture of tungsten and fluorescent overhead

Makeup Datasets

Makeup Datasets contain four datasets of female face images assembled for studying the impact of makeup on face recognition.

YouTube Makeup (YMU)

  • 151 subjects, specifically Caucasian females, from YouTube makeup tutorials, before and after the application of makeup. There are four shots per subject: two shots before the application of makeup and two shots after the application of makeup.
  • Citation: A. Dantcheva, C. Chen, A. Ross, "Can Facial Cosmetics Affect the Matching Accuracy of Face Recognition Systems?," Proc. of 5th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), (Washington DC, USA), September 2012.
  • Citation: C. Chen, A. Dantcheva, A. Ross, "Automatic Facial Makeup Detection with Application in Face Recognition," Proc. of 6th IAPR International Conference on Biometrics (ICB), (Madrid, Spain), June 2013.

Virtual Makeup (VMU)

  • Face images of 51 Caucasian female subjects in the FRGC repository (http://www.nist.gov/itl/iad/ig/frgc.cfm) were synthetically modified to simulate the application of makeup.
  • Citation: A. Dantcheva, C. Chen, A. Ross, "Can Facial Cosmetics Affect the Matching Accuracy of Face Recognition Systems?," Proc. of 5th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), (Washington DC, USA), September 2012.

Makeup Induced Face Spoofing (MIFS)

  • Dataset consisting of 107 makeup transformations taken from random YouTube makeup video tutorials. Each subject is attempting to spoof a target identity (a celebrity).
  • Citation: C. Chen, A. Dantcheva, T. Swearingen, A. Ross, "Spoofing Faces Using Makeup: An Investigative Study," Proc. of 3rd IEEE International Conference on Identity, Security and Behavior Analysis (ISBA), (New Delhi, India), February 2017.

Meissner African American and Caucasian Male Sets

These sets contain stimuli for use in studies on cross-racial face recognition and identification. The sets are available by email request to Dr. Meissner for those seeking to conduct research on face identification. The stimuli currently include African American and Caucasian male faces in two poses (smiling with casual clothing and non-smiling with burgundy sweatshirt).

Citation: Meissner, C. A., Brigham, J. C., & Butz, D. A. (2005). Memory for own‐and other‐race faces: A dual‐process approach. Applied Cognitive Psychology: The Official Journal of the Society for Applied Research in Memory and Cognition, 19(5), 545-567.

Contact: Christian Meissner, cmeissner@utep.edu

Montreal Set of Facial Displays of Emotion (MSFDE)

The Montreal Set of Facial Displays of Emotion (MSFDE) consists of emotional facial expressions by men and women of European, Asian, and African descent. Each expression was created using a directed facial action task, and all expressions were FACS coded to ensure identical expressions across actors.

The set contains expressions of happiness, sadness, anger, fear, disgust, and embarrassment as well as a neutral expression for each actor.

Contact: Social Psychophysiology Laboratory, Université du Québec à Montréal

MORPH Database (Academic)

The Academic MORPH database (non-commercial) was collected over a span of 5 years, with numerous images of the same subject (longitudinal). This is not a controlled collection (i.e., it was collected in real-world conditions). The dataset also contains metadata in the form of age, gender, and race. The database has 55,134 images of 13,618 subjects.

Contact: Available for purchase

MMI Facial Expression Database

The MMI Facial Expression Database is an ongoing project that aims to deliver large volumes of visual data of facial expressions to the facial expression analysis community. The database consists of over 2,900 videos and high-resolution still images of 75 subjects.

Citation: Valstar, M., & Pantic, M. (2010, May). Induced disgust, happiness and surprise: an addition to the mmi facial expression database. In Proc. 3rd Intern. Workshop on EMOTION (satellite of LREC): Corpora for Research on Emotion and Affect (p. 65).

Contact: mmi_face_db@mahnob-db.eu

MR2 Face Database

The MR2 is a multi-racial, mega-resolution database of facial stimuli, created in collaboration with the psychologist Kurt Gray and the photographer Titus Brooks Heagins.  It contains 74 full-color images of men and women of European, African, and East Asian descent.

Citation: Strohminger, N., Gray, K., Chituc, V., Heffner, J., Schein, C., and Heagins, T.B. (in press). The MR2: A multi-racial mega-resolution database of facial stimuli. Behavior Research Methods.

Contact: Nina Strohminger, humean@wharton.upenn.edu

MUCT Face Database

The MUCT Face Database consists of 3755 faces with 76 manual landmarks. The database was created to provide more diversity of lighting, age, and ethnicity than currently available landmarked 2D face databases.

Citation: Milborrow, S., Morkel, J., & Nicolls, F. (2010). The MUCT landmarked face database. Pattern Recognition Association of South Africa, 2010.

Contact: Stephen Milborrow, milbo@sonic.net

NimStim Set of Facial Expressions

The NimStim Set of Facial Expressions is a broad dataset comprising 672 images of naturally posed photographs by 43 professional actors (18 female, 25 male) ranging from 21 to 30 years old. Actors from a diverse sample were chosen to portray emotional expressions within this dataset. To be precise, the actors were African-American (N = 10), Asian-American (N = 6), European-American (N = 25), and Latino-American (N = 2). The images contained in this dataset include eight emotional expressions: neutral, angry, disgust, surprise, sad, calm, happy, and afraid. Both open- and closed-mouth versions were provided for all emotional expressions, with the exception of surprise (only open mouth provided) and happy (high arousal open mouth/exuberant provided).

Citation: Tottenham, N., Tanaka, J. W., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., ... & Nelson, C. (2009). The NimStim set of facial expressions: judgments from untrained research participants. Psychiatry Research, 168(3), 242-249.

Contact: Nim Tottenham, nlt7@columbia.edu

Oslo Face Database

The Oslo Face Database consists of ~200 male and female faces of neutral expression with three gaze directions: left, center and right. The photos were taken in 2012 of students from the University of Oslo.

Oulu-CASIA NIR&VIS Facial Expression Database

Oulu-CASIA NIR&VIS facial expression database contains videos with the six typical expressions (happiness, sadness, surprise, anger, fear, disgust) from 80 subjects captured with two imaging systems, NIR (Near Infrared) and VIS (Visible light), under three different illumination conditions: normal indoor illumination, weak illumination (only computer display is on) and dark illumination (all lights are off).

Citation: Zhao, G., Huang, X., Taini, M., Li, S. Z., & Pietikäinen, M. (2011). Facial expression recognition from near-infrared videos. Image and Vision Computing, 29(9), 607-619.

Contact: Guoying Zhao, guoying.zhao@oulu.fi

Pictures of Facial Affect (POFA)

The POFA collection consists of 110 photographs of facial expressions that have been widely used in cross-cultural studies and, more recently, in neuropsychological research. All images are black and white. A brochure providing norms is included with the collection. It is important to note that these images are not identical in intensity or facial configuration.

Contact: Paul Ekman

Psychological Image Collection at Stirling (PICS)

The Psychological Image Collection at Stirling (PICS) contains two databases of face images. 

Stirling/ESRC 3d face database

  • 45 male and 54 female sets

2D face sets

  • 9 collections containing hundreds of images

Citation: varies

Contact: Peter Hancock, pjbh1@stir.ac.uk

Radboud Faces Database

The Radboud Faces Database (RaFD) is a set of pictures of 67 models (including Caucasian males and females, Caucasian children, both boys and girls, and Moroccan Dutch males) displaying 8 emotional expressions. The RaFD is an initiative of the Behavioural Science Institute of the Radboud University Nijmegen, located in Nijmegen (the Netherlands), and can be used freely for non-commercial scientific research by researchers who work for an officially accredited university.

RADIATE Emotional Face Stimulus Set

RADIATE is an open-access face stimulus set of 1,721 racially diverse expressions. Sixteen different emotions are included, in color and in black-and-white versions.

Citation: Conley, M. I., Dellarco, D. V., Rubien-Thomas, E., Cohen, A. O., Cervera, A., Tottenham, N., & Casey, B. J. (2018). The racially diverse affective expression (RADIATE) face stimulus set. Psychiatry Research.

Sheffield Face Database

The Sheffield Face Database (previously UMIST) consists of 564 images of 20 individuals (mixed race/gender/appearance). Each individual is shown in a range of poses from profile to frontal views, each in a separate directory labelled 1a, 1b, … 1t, with images numbered consecutively as they were taken. The files are all in PGM format, approximately 220 x 220 pixels, with 256-level (8-bit) grey-scale.

Citation: Wechsler, H., Phillips, J. P., Bruce, V., Soulie, F. F., & Huang, T. S. (Eds.). (2012). Face recognition: From theory to applications (Vol. 163). Springer Science & Business Media.

Contact: Laboratory of Vision Engineering (LoVE), University of Lincoln

Todorov Synthetic Faces Databases

Several databases of computer-generated synthetic faces.

Citations: varies

Contact: Alexander Todorov, University of Chicago

Database 1

  • 300 randomly generated faces parametrically manipulated to vary on their perceived value on social dimensions such as trustworthiness and dominance. These faces were generated by data-driven computational models.

Database 2

  • 525 faces manipulated on face shape: 25 (face identities) x 3 (trait dimensions: perceived dominance, threat, and trustworthiness) x 7 (parametric face manipulations, ranging from -3 to +3SD with a step of 1SD).

Database 3

  • 490 faces manipulated on face shape and orthogonally on perceived trustworthiness and dominance: 10 (face identities) x 7 (parametric face manipulations on perceived dominance, ranging from -3 to +3SD with a step of 1SD) x 7 (parametric face manipulations on perceived trustworthiness, ranging from -3 to +3SD with a step of 1SD).
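
The counts in these factorial designs multiply out directly; as a minimal check (identity labels hypothetical), Database 3's 10 x 7 x 7 design can be enumerated like so:

```python
from itertools import product

identities = range(1, 11)              # 10 face identities
sd_levels = [-3, -2, -1, 0, 1, 2, 3]   # -3 SD to +3 SD in steps of 1 SD

# One stimulus per (identity, dominance level, trustworthiness level)
stimuli = list(product(identities, sd_levels, sd_levels))
print(len(stimuli))  # 490
```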

Database 4

  • 3,675 faces manipulated on face shape and reflectance: 25 (face identities) x 7 (trait dimensions: perceived attractiveness, competence, dominance, extroversion, likability, threat, and trustworthiness) x 7 (parametric face manipulations, ranging from -3 to +3SD with a step of 1SD) x 3 (face race: Asian, Black, White). 

Database 5

  • 13,125 faces manipulated on face shape and reflectance: 25 (face identities) x 7 (trait dimensions: perceived attractiveness, competence, dominance, extroversion, likability, threat, and trustworthiness) x 25 (parametric face manipulations, ranging from -3 to +3SD with a step of 0.25SD) x 3 (face race: Asian, Black, White). 

Database 6

  • 4,000 faces used to build a model of attractiveness. Text files, data files, and Python and MATLAB scripts are also included.

Database 7

  • 1,400 faces manipulated on face shape and reflectance by gender-specific models built by Oh, Dotsch, Porter, & Todorov (2020): 25 (face identities) x 2 (gender models: for males and females) x 2 (trait dimensions: perceived dominance and trustworthiness) x 7 (parametric face manipulations, ranging from -3 to +3SD with a step of 1SD) x 2 (face gender: male and female).

Database 8

  • 350 faces manipulated on perceived competence controlling for attractiveness: 25 (face identities) x 7 (parametric face manipulations, ranging from -3 to +3SD with a step of 1SD) x 2 (models: attractiveness-subtracted and attractiveness-orthogonal). 

UB KinFace

The UB KinFace database is used to develop, test, and evaluate kinship verification and recognition algorithms. It comprises 600 images of 400 people, which can be separated into 200 groups. Each group is composed of child, young-parent, and old-parent images. Most of the images in the database are real-world collections of public figures (celebrities and politicians) from the Internet. To the best of the authors' knowledge, it is the first database that contains children, young parents, and old parents for the purpose of kinship verification.

Citation: Ming Shao, Siyu Xia, and Yun Fu, "Genealogical Face Recognition based on UB KinFace Database," IEEE CVPR Workshop on Biometrics (BIOM), 2011.

Contact: Yun Raymond Fu, yunfu@ece.neu.edu

US Politicians

This database contains 550 photos of US politicians who competed either in a gubernatorial race (248) or in a House race (302). The database also contains the politicians' perceived competence from their photos, as measured by forced-choice competence judgments from participants unfamiliar with the politicians. As such, these judgments simply indicate perceptions and are in no way indicative of the actual competence of the politicians.

Contact: Alexander Todorov, University of Chicago

Yale Face Database

The Yale Face Database (size 6.4MB) contains 165 grayscale images in GIF format of 15 individuals. There are 11 images per subject, one per different facial expression or configuration: center-light, w/glasses, happy, left-light, w/no glasses, normal, right-light, sad, sleepy, surprised, and wink.

The Yale Face Database B (1GB) contains 5760 single light source images of 10 subjects each seen under 576 viewing conditions (9 poses x 64 illumination conditions).

Citation: varies

Contact: UCSD Computer Vision 

Yonsei Face Database (YFace DB)

The Yonsei Face Database (YFace DB) consists of both static and dynamic face stimuli for six basic emotions (happiness, sadness, anger, surprise, fear, and disgust). The database includes selected pictures (static stimuli) and film clips (dynamic stimuli) of 74 models (50% female) aged between 19 and 40.

Citation: Chung, K. M., Kim, S. J., Jung, W. H., & Kim, V. Y. (2019). Development and Validation of the Yonsei Face Database (YFace DB). Frontiers in Psychology, 10, 2626. https://doi.org/10.3389/fpsyg.2019.02626