I work at ValueFirst Digital Media Private Ltd. I am a Product Marketer in the Surbo Team. Surbo is a chatbot generator platform owned by ValueFirst.
Controversial AI software that claimed to detect gay people from photos has been debunked
A controversial AI system that researchers claimed could determine whether someone is gay from the shape of their face has now been debunked.
Experts concluded that the program, developed at Stanford University, cannot determine sexuality simply from the facial features in a photo.
Instead, they found that the AI relies on patterns in how heterosexual and homosexual people take selfies to make its decision.
That includes superficial details like the amount of makeup and facial hair on show, as well as different preferences for the type of angles used to take the shots.
Critics slammed the software when it first emerged in September, saying it could be used to 'out' men and women currently in the closet.
The methods used to create the software also caused outrage, as its training data was limited to a predominantly young, white section of the LGBT community.
Researchers from Google and Princeton University re-examined the data used to create the original Stanford software to reach these findings.
The AI analysed 35,326 images of men and women from a US dating website, all of whom had declared their sexuality on their profiles.
The Stanford team claimed their software, which correctly identified a man's sexuality 91 per cent of the time and a woman's 71 per cent of the time, could detect subtle differences in facial structure that the human eye struggles to pick out.
Google AI experts Blaise Aguera y Arcas and Margaret Mitchell, joined by Princeton social psychologist Alexander Todorov, studied the composite images generated by the AI as average representations of the faces of heterosexual and homosexual men and women.
They found more obvious surface level differences, like the presence of glasses, which stood out in the images.
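The "composite images" mentioned above are simply pixel-wise averages of many aligned face photos, which is why only traits shared across a group (glasses, makeup) remain visible while individual features blur away. A minimal sketch of that idea, assuming the images are pre-aligned and equally sized (the function name and shapes here are illustrative, not from the study):

```python
import numpy as np

def composite_face(images):
    """Pixel-wise mean of aligned, same-sized face images.

    images: list of HxWxC uint8 arrays, assumed pre-aligned.
    Returns a uint8 array of the same shape as one input image.
    """
    # Average in float to avoid uint8 overflow, then convert back.
    stack = np.stack([img.astype(np.float64) for img in images])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)
```

Because averaging washes out anything not consistent across the set, a feature that survives in the composite (such as the rim of glasses) must be common to most of the input faces, which is exactly the kind of surface-level pattern the critics identified.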
They conducted a survey of 8,000 Americans using Amazon's Mechanical Turk crowdsourcing platform to independently confirm these patterns.
Further study suggested that it was these kinds of variables the AI was able to pick up on, rather than any physiological differences.
Presenting their findings in a Medium blog post, the authors wrote: '[Stanford researchers] assert that the key differences are in physiognomy, meaning that a sexual orientation tends to go along with a characteristic facial structure.
'However, we can immediately see that some of these differences are more superficial.
'For instance, the average straight woman appears to wear eyeshadow, while the average lesbian does not.'
Heterosexual men are likely to take selfies from slightly below, which has the effect of enlarging the chin, shrinking the forehead, shortening the nose and attenuating the smile.
The obvious differences between lesbian or gay and straight faces in these selfies relate to presentation, grooming and lifestyle: a difference in culture, not in facial structure.