Now, it seems, one need only look at a person's photo to determine their sexual orientation.
Scientists from Stanford have developed a deep-learning algorithm that determines a person's sexual orientation from a photo.
For training, the researchers used 35 thousand photos of white men and women of both orientations from a dating site. The classifier learned to correctly determine a person's orientation from a single photo in 81% of cases for men and in 74% of cases for women.
Human judges fared worse: 61% and 54%, respectively. When the algorithm is shown five photos of the same person, classification accuracy rises to 91% and 83%, respectively.
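The jump in accuracy from one photo to five is what you would expect from aggregating several noisy predictions. A minimal sketch of the idea, assuming (unlike real photos of one person, which are correlated) that each photo gives an independent prediction and that the final call is a simple majority vote — under this naive model the boost is even larger than the 91%/83% the study reports:

```python
from math import comb

def majority_vote_accuracy(p: float, n: int) -> float:
    """Probability that a majority vote over n independent
    per-photo predictions is correct, given per-photo accuracy p."""
    need = n // 2 + 1  # votes needed for a strict majority (n odd)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(need, n + 1))

# Reported single-photo accuracies: 81% for men, 74% for women.
print(majority_vote_accuracy(0.81, 5))  # ~0.95 under independence
print(majority_vote_accuracy(0.74, 5))  # ~0.89 under independence
```

That the real numbers (91% and 83%) fall short of this independent-vote ceiling is consistent with five photos of the same person sharing most of their information.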
The facial features the classifier treats as key include both permanent ones (nose, jaw) and transient ones (facial expression).
The study showed that homosexual men and women more often have facial features, expressions, and grooming (including facial hair) that are atypical for their sex. For example, gay men more often have a longer nose, higher forehead, and narrower jaw, while homosexual women have a wider jaw and lower forehead.
This partially supports the theory that orientation is formed before birth, in the womb, and that both atypical orientation and atypical facial features result from a deficiency of one's own sex hormones or an excess of the opposite sex's hormones. In other words, if orientation can be determined so accurately from the face, it is more likely that gay people are born, not made.