Online dating sites can be daunting for users worried that their bodies leave something to be desired. In order to lure potential matches, people often post profile photos angled just enough to show their actual faces, while misleading viewers about the proportions of their bodies — or cutting off everything below the neck altogether.

Such self-enhancers may soon be out of luck. By analyzing facial features in a simple headshot photo, a new computer vision algorithm developed by researchers at West Virginia University in Morgantown can automatically predict a person's body mass index (BMI).

The study, published in May in the journal Image and Vision Computing, builds on previous research linking facial features to BMI, a measure of body mass calculated either by dividing a person's weight in kilograms by the square of their height in meters, or by dividing the weight in pounds by the square of the height in inches and multiplying the result by 703.
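The two formulas above are equivalent up to unit conversion, and can be sketched in a few lines (function names here are illustrative, not from the paper):

```python
def bmi_metric(weight_kg: float, height_m: float) -> float:
    """BMI in metric units: weight (kg) divided by the square of height (m)."""
    return weight_kg / height_m ** 2

def bmi_imperial(weight_lb: float, height_in: float) -> float:
    """BMI in imperial units: weight (lb) over height (in) squared, times 703."""
    return 703 * weight_lb / height_in ** 2
```

For example, a person weighing 70 kg at 1.75 m has a BMI of about 22.9; the same person measured in imperial units (about 154 lb, 69 in) comes out to roughly the same value.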

BMI is commonly used as a physical marker in health care, and an elevated BMI is a significant risk factor for conditions associated with obesity, such as metabolic syndrome and heart disease. A BMI between 18.5 and 25 is considered normal; below 18.5 is underweight, between 25 and 30 is overweight, and above 30 is clinically obese.
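Those standard cut-offs map a BMI value to a category with simple threshold checks:

```python
def bmi_category(bmi: float) -> str:
    """Map a BMI value to the standard clinical category."""
    if bmi < 18.5:
        return "underweight"
    elif bmi < 25:
        return "normal"
    elif bmi < 30:
        return "overweight"
    else:
        return "obese"
```

So a BMI of 22 falls in the normal range, while 31 is classified as obese.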

Based on a database of over 14,500 face photos, Lingyun Wen and Guodong Guo developed a machine vision algorithm that automatically detects seven characteristic ratios: cheekbone width to jaw width, cheekbone width to upper facial height, perimeter to area, average eye size, lower face to face height, face width to lower face height, and average distance between the eyebrows and eyes.
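Ratios like these are typically computed from the pixel coordinates of detected facial landmarks. A minimal sketch of two of the seven ratios follows; the landmark names, positions, and helper functions here are illustrative assumptions, not the authors' code:

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

# Hypothetical facial landmarks as (x, y) pixel coordinates in a face photo.
landmarks = {
    "left_cheekbone":  (40.0, 120.0),
    "right_cheekbone": (160.0, 120.0),
    "left_jaw":        (55.0, 180.0),
    "right_jaw":       (145.0, 180.0),
    "brow_midpoint":   (100.0, 80.0),
    "nose_base":       (100.0, 150.0),
}

def cheekbone_to_jaw_width(lm):
    """Cheekbone width divided by jaw width (wider cheekbones give a ratio > 1)."""
    return (dist(lm["left_cheekbone"], lm["right_cheekbone"])
            / dist(lm["left_jaw"], lm["right_jaw"]))

def cheekbone_width_to_upper_face_height(lm):
    """Cheekbone width divided by upper facial height (brow line to nose base)."""
    return (dist(lm["left_cheekbone"], lm["right_cheekbone"])
            / dist(lm["brow_midpoint"], lm["nose_base"]))
```

In a full pipeline, a feature vector of all seven ratios would be fed to a trained regression model to produce the BMI estimate.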

An illustration of the seven ratios the algorithm uses to predict BMI from a face photo. [Image and Vision Computing]

The resulting computations had an error rate of two to four BMI points — not bad for guessing someone's proportions based only on a mug shot.

The calculations were less accurate at estimating the BMIs of underweight people than those of people closer to the average, which the researchers attribute to a shortage of training images of people in that range.

The algorithm could also use more tweaking to account for differences across ethnic, age, and sex groups, they write.

"This could be used in smart health applications, relating face images to BMI and associated health risks," Guo said to New Scientist. "Or on online dating sites, for instance, it could help you assess the BMI and state of health of people you might date."

Beyond scoping out the proportions of potential mates and other people on social media apps, the researchers write, the algorithm could be used in practical settings like estimating a criminal suspect's BMI from surveillance photos of faces, which often lack visual references for height and weight.

Source: Wen L, Guo G. A computational approach to body mass index prediction from face images. Image and Vision Computing. 2013.