Automatic gender spoof detection (make-up and mustaches)

Some might take the view that if you use a camera-equipped laptop, tablet, or smartphone, or post your snaps to a social networking site, then various corporate or governmental organisations (or both) are likely to run an automated face-recognition algorithm on your photo(s). If you object to such practices, you might choose to deliberately mislead the systems – say, by attempting to disguise (or ‘spoof’) your gender with a drawn-on mustache. Facial recognition experts worldwide are aware of this, and are taking steps. See, for example, the work of Dr. Antitza Dantcheva, a Marie Curie fellow and post-doctoral researcher in the Spatio-Temporal Activity Recognition Systems (STARS) team at INRIA Méditerranée in Sophia Antipolis, France.

Dr. Dantcheva has co-authored a number of papers exploring the possible effects of such trickery – for example, ‘Impact of facial cosmetics on automatic gender and age estimation algorithms’, paper 341 in VISAPP’14, the 9th International Conference on Computer Vision Theory and Applications, January 5–8, 2014, Lisbon, Portugal.

As part of the project, the team searched the Web and assembled a unique photo resource which they call the Makeup Induced Gender Alteration (MIGA) Dataset, including, for example, female subjects with drawn-on mustaches (using Kohl* eyeliner).

“[…] we consider the use of facial cosmetics for (a) gender spoofing where male subjects attempt to look like females and vice versa, and (b) age alteration where female subjects attempt to look younger or older than they actually are. While such transformations are known to impact human perception, their impact on computer vision algorithms has not been studied. Our findings suggest that facial cosmetics can potentially be used to confound automated gender and age estimation schemes.”

They note that:

“While a subject may not use makeup to intentionally defeat the system, it is not difficult to envision scenarios where a malicious user may employ commonly-used makeup to deceive the system. Future work will involve developing algorithms that are robust to changes introduced by facial makeup.”
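For the curious, here is a minimal sketch (not the authors’ code) of how one might probe an off-the-shelf gender and age estimator with before/after-makeup photo pairs, in the spirit of the paper’s experiments. It assumes the open-source `deepface` Python package; the file names are placeholders.

```python
# Sketch: compare gender/age predictions on before/after-makeup photo pairs.
# Assumes the open-source `deepface` package; image file names are placeholders.
from deepface import DeepFace

PAIRS = [
    ("subject01_no_makeup.jpg", "subject01_makeup.jpg"),
    ("subject02_no_makeup.jpg", "subject02_makeup.jpg"),
]

def estimate(path):
    """Return (predicted gender, predicted age) for a single face photo."""
    result = DeepFace.analyze(img_path=path, actions=["gender", "age"])
    # Recent deepface versions return a list with one dict per detected face.
    face = result[0] if isinstance(result, list) else result
    return face["dominant_gender"], face["age"]

for before, after in PAIRS:
    g0, a0 = estimate(before)
    g1, a1 = estimate(after)
    print(f"{before}: {g0}, {a0}y -> {after}: {g1}, {a1}y "
          f"(gender flipped: {g0 != g1})")
```

If the makeup (or drawn-on mustache) succeeds in ‘spoofing’ the estimator, the predicted gender flips or the predicted age shifts between the two photos of the same subject.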

Further resources, provided by Dr. Dantcheva: the YMU (YouTube Makeup), VMU (Virtual Makeup), and MIW (Makeup in the “wild”) photo datasets (registration required).

*Note: A renowned proponent of Kohl eyeliner.

Also see: Progress in Fake-Finger Thwarting