Saturday, April 20, 2019

On gays and lesbians - can you see it in the face?

Lately I have been studying artificial intelligence, or rather data analysis, on my employer's time - we are allowed to spend 10% of our working hours freely on self-development. A typical application of data analysis is automating a bank's loan decision once we have enough data about the customer. Data analysis can of course be used for many other things as well - including topics perhaps more interesting from this blog's point of view.
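
As an aside, a minimal sketch of what such an automated loan decision could look like - the features, thresholds and data below are invented purely for illustration, not any bank's actual model:

```python
# Hypothetical loan-decision sketch: fit a logistic regression on made-up
# customer features, then accept or reject a new application automatically.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Fake customer data: income (k eur/year), debt ratio, years in current job.
X = rng.normal(loc=[40, 0.4, 5], scale=[15, 0.2, 4], size=(500, 3))
# Fake labels: 1 = repaid the loan, 0 = defaulted (a synthetic rule).
y = ((X[:, 0] > 35) & (X[:, 1] < 0.5)).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Automated decision for a new applicant; the 0.8 cut-off is an assumption.
applicant = np.array([[32, 0.55, 2]])
p_repay = model.predict_proba(applicant)[0, 1]
print("accept" if p_repay > 0.8 else "reject", round(p_repay, 2))
```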

According to a study published in 2017, sexual orientation can be inferred fairly reliably from facial photographs. Similar technologies can no doubt be used more broadly in any "Orwell 1984 2.0" project, that is, in chipping away at privacy. The technology might benefit, say, banks and insurance companies that want to estimate a customer's risk of defaulting on a loan or being involved in an accident, or people who want to assess a potential spouse, for instance with respect to whether he or she would be a trustworthy partner.

I quote The Economist:

MODERN artificial intelligence is much feted. But its talents boil down to a superhuman ability to spot patterns in large volumes of data. Facebook has used this ability to produce maps of poor regions in unprecedented detail, with an AI system that has learned what human settlements look like from satellite pictures. Medical researchers have trained AI in smartphones to detect cancerous lesions; a Google system can make precise guesses about the year a photograph was taken, simply because it has seen more photos than a human could ever inspect, and has spotted patterns that no human could.

AI’s power to pick out patterns is now turning to more intimate matters. Research at Stanford University by Michal Kosinski and Yilun Wang has shown that machine vision can infer sexual orientation by analysing people’s faces. The researchers suggest the software does this by picking up on subtle differences in facial structure. With the right data sets, Dr Kosinski says, similar AI systems might be trained to spot other intimate traits, such as IQ or political views. Just because humans are unable to see the signs in faces does not mean that machines cannot do so. 

The researchers’ program, details of which are soon to be published in the Journal of Personality and Social Psychology, relied on 130,741 images of 36,630 men and 170,360 images of 38,593 women downloaded from a popular American dating website, which makes its profiles public. Basic facial-detection technology was used to select all images which showed a single face of sufficient size and clarity to subject to analysis. This left 35,326 pictures of 14,776 people, with gay and straight, male and female, all represented evenly.
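
The quote does not say which face-detection tool was used, so the following is only a sketch of such a filtering step, assuming OpenCV's Haar cascade detector and a made-up minimum face size:

```python
# Keep only photos that contain exactly one face of sufficient size,
# roughly as described in the article; thresholds are assumptions.
import cv2
import glob

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def keep_image(path, min_side=100):
    img = cv2.imread(path)
    if img is None:
        return False
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Exactly one face, and its bounding box must be at least min_side pixels.
    return len(faces) == 1 and min(faces[0][2], faces[0][3]) >= min_side

selected = [p for p in glob.glob("photos/*.jpg") if keep_image(p)]
print(len(selected), "images kept")
```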

The images were then fed into a different piece of software called VGG-Face, which spits out a long string of numbers to represent each person; their “faceprint”. The next step was to use a simple predictive model, known as logistic regression, to find correlations between the features of those faceprints and their owners’ sexuality (as declared on the dating website). When the resulting model was run on data which it had not seen before, it far outperformed humans at distinguishing between gay and straight faces.
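
A minimal sketch of the pipeline described above - random vectors stand in for the real VGG-Face faceprints (the 4096-dimensional size matches VGG-Face's fully connected layer), and the labels and train/test split are purely illustrative:

```python
# Faceprint -> logistic regression -> evaluate on unseen data, as in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4096))      # stand-in for VGG-Face faceprints
y = rng.integers(0, 2, size=1000)      # 1 = gay, 0 = straight (as declared)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# With real faceprints the study reports far-above-chance performance;
# with random vectors this stays near 50%.
print("held-out accuracy:", clf.score(X_test, y_test))
```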

When shown one photo each of a gay and straight man, both chosen at random, the model distinguished between them correctly 81% of the time. When shown five photos of each man, it attributed sexuality correctly 91% of the time. The model performed worse with women, telling gay and straight apart with 71% accuracy after looking at one photo, and 83% accuracy after five. In both cases the level of performance far outstrips human ability to make this distinction. Using the same images, people could tell gay from straight 61% of the time for men, and 54% of the time for women. This aligns with research which suggests humans can determine sexuality from faces at only just better than chance.
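
The accuracy figures above come from a pairwise protocol: the model is shown one photo of a gay man and one of a straight man and must rank the gay face higher. A self-contained sketch of that scoring, using synthetic probabilities rather than real model outputs:

```python
# Pairwise forced-choice accuracy: how often does the "gay" face get the
# higher predicted probability? Scores here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
gay_scores = rng.beta(5, 3, size=1000)       # hypothetical P(gay) for gay faces
straight_scores = rng.beta(3, 5, size=1000)  # hypothetical P(gay) for straight faces

def pairwise_accuracy(pos, neg, n_pairs=10_000):
    p = rng.choice(pos, n_pairs)
    n = rng.choice(neg, n_pairs)
    # Correct whenever the gay face scores higher; ties count as half.
    return np.mean((p > n) + 0.5 * (p == n))

print("pairwise accuracy:", round(pairwise_accuracy(gay_scores, straight_scores), 3))
```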

... The study has limitations. Firstly, images from a dating site are likely to be particularly revealing of sexual orientation. The 91% accuracy rate only applies when one of the two men whose images are shown is known to be gay. Outside the lab the accuracy rate would be much lower... 

However, when asked to pick out the ten faces it was most confident about, nine of those chosen were in fact gay. If the goal is to pick a small number of people who are very likely to be gay out of a large group, the system appears able to do so. The point is not that Dr Kosinski and Mr Wang have created software which can reliably determine gay from straight. That was not their goal. Rather, they have demonstrated that such software is possible.
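
The "ten most confident" check is simply precision at the top of the ranking. A small sketch, again with synthetic scores and labels standing in for the model's actual outputs:

```python
# Sort held-out faces by predicted probability and count how many of the
# ten most confident picks are labelled gay. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=2000)                       # 1 = gay, 0 = straight
scores = np.clip(0.5 + 0.2 * (labels - 0.5)
                 + rng.normal(0, 0.15, size=2000), 0, 1)     # noisy fake P(gay)

top10 = np.argsort(scores)[-10:]                             # ten most confident picks
print("labelled gay among the top ten:", int(labels[top10].sum()))
```
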
Other Economist articles related to facial recognition and similar topics are listed below:

1. What machines can tell from your face

3 comments:

Sami wrote...

The matter is considerably more complicated: https://medium.com/@blaisea/do-algorithms-reveal-sexual-orientation-or-just-expose-our-stereotypes-d998fafdf477

J. Haakana wrote...

What a good article!

Jukka Aakula wrote...

Thanks, J.