Rajendra

I write columns on news related to bots, especially in the categories of artificial intelligence, bot startups, and bot funding. I am also interested in recent developments in the fields of data science, machine learning, and natural language processing.



Facebook's unorthodox new revenge porn defense is to upload nudes to Facebook

By Rajendra | Nov 19, 2017

Facebook is testing a new preemptive revenge porn defense in Australia that may, at first blush, feel counterproductive: uploading your nude photos or videos directly to Messenger. According to the Australian Broadcasting Corporation, Facebook has partnered with the office of the Australian government's e-Safety Commissioner, which works primarily to prevent the online abuse of minors, to develop the new system for combating the nonconsensual sharing of explicit media.

If you upload the images or videos you fear may later be shared in an attempt to shame or harass you, Facebook can digitally "hash" the media, effectively giving it a digital footprint. This allows the social network to track the media using the same artificial intelligence-based technologies it uses in its photo- and face-matching algorithms, and then prevent it from being uploaded and shared in the future. The approach works only if you're in possession of the original file, but because the system analyzes and tags the actual content of the image or video, it should withstand attempts by a malicious third party to alter the metadata.
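Facebook's actual algorithm is not public (it is reported to be a perceptual, content-based fingerprint rather than a cryptographic hash). As a toy sketch only, one common perceptual technique, a "difference hash," illustrates the idea: each bit records a brightness comparison between neighboring pixels, so the fingerprint reflects the image's structure rather than its exact bytes, and a lightly edited copy still produces a nearly identical fingerprint.

```python
# Toy difference-hash ("dHash") sketch. This is NOT Facebook's algorithm,
# just an illustration of perceptual fingerprinting on a tiny grid of
# grayscale values (a real implementation would first resize a full image
# down to a small fixed grid).

def dhash(pixels):
    """Build a fingerprint from a 2D grid of grayscale values.

    Each bit records whether a pixel is brighter than its right-hand
    neighbour, so the hash captures gradient structure, not exact bytes.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    # Pack the bits into a single integer fingerprint.
    return int("".join(map(str, bits)), 2)

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# A tiny 4x4 "image" and a near-duplicate with one pixel nudged.
original = [[10, 20, 30, 40],
            [40, 30, 20, 10],
            [10, 20, 30, 40],
            [40, 30, 20, 10]]
tweaked  = [[10, 20, 30, 40],
            [40, 30, 20, 25],   # one pixel lightened
            [10, 20, 30, 40],
            [40, 30, 20, 10]]

h1, h2 = dhash(original), dhash(tweaked)
print(hamming_distance(h1, h2))  # -> 1 (small distance: likely the same image)
```

Because near-duplicates land close together in Hamming distance, the service can flag edited re-uploads of a known image while storing only the fingerprint, never the image itself.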

Facebook first rolled out a similar, though less preemptive, mechanism for curbing the proliferation of revenge porn back in April: a photo-matching system that prevents the spread of images that have already been reported and taken down. The company has also liberally banned accounts for revenge porn activity. But now Facebook seems to be asking users to think ahead and play it safe if they feel particularly vulnerable, as could be the case in a relationship that becomes abusive over time or only after it has ended.

"We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly," e-Safety Commissioner Julie Inman Grant told ABC. "They're not storing the image. They're storing the link and using artificial intelligence and other photo-matching technologies. So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded."
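The upload-time check Inman Grant describes could be sketched as follows. This is a hypothetical illustration, not Facebook's implementation: the service stores only integer fingerprints, and an incoming upload is rejected if its fingerprint falls within a small Hamming-distance threshold of any banned one (the threshold of 4 here is an arbitrary value chosen for illustration).

```python
# Hypothetical upload filter: only fingerprints are stored, and a new
# upload is blocked if its fingerprint is "close enough" to a banned one.

def hamming(h1, h2):
    """Differing bits between two integer fingerprints."""
    return bin(h1 ^ h2).count("1")

def is_blocked(upload_hash, banned_hashes, threshold=4):
    """Reject the upload if its fingerprint is near any banned fingerprint."""
    return any(hamming(upload_hash, banned) <= threshold
               for banned in banned_hashes)

banned = {0b101100111010, 0b010011000101}   # fingerprints only, no images kept

print(is_blocked(0b101100111000, banned))   # near-match to a banned hash -> True
print(is_blocked(0b000000000000, banned))   # unrelated fingerprint       -> False
```

Matching within a distance threshold, rather than exactly, is what lets the filter catch lightly edited copies; the trade-off is that the threshold must be tuned to avoid false positives on genuinely different images.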

Of course, there are a few concerns worth mentioning. Although Facebook uses a hashing system to avoid storing the photos or videos directly on its servers, the company's poor reputation on privacy and consumer trust means everyday users might see uploading directly to Messenger as the equivalent of posting revenge porn against themselves. And because the system is only a test right now, there's no telling whether it can be fooled by altering aspects of a photo, sometimes in subtle and even imperceptible ways, to slip past Facebook's filters.

Successful attempts at tricking machine vision systems are well documented. Hackers and researchers alike have used so-called "adversarial images," which rely on digital manipulations to deceive AI algorithms, either by making a facial recognition system think one person looks like another, or by forcing image recognition software to see an object in what is really just a noisy mess of geometric shapes. It's not far-fetched to think Facebook's automatic revenge porn filtering system could be bypassed in similar ways, by embedding enough hidden data in an image to make it seem, to an algorithm's eyes, different enough from the source.
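One reason content-based matching is needed at all, and why it leaves room for attack, becomes clear by contrast with exact cryptographic hashing: changing a single byte of a file yields a completely different cryptographic digest, so exact hashes catch only bit-identical copies. Perceptual fingerprints tolerate small changes, and that tolerance is precisely the slack an adversarial perturbation tries to exploit. A small stdlib-only demonstration of the first point:

```python
# Changing one byte of a file completely scrambles its cryptographic hash,
# so exact hashing cannot catch edited re-uploads. The byte strings below
# are stand-ins for real image data.
import hashlib

original = bytes([10, 20, 30, 40] * 4)
tweaked  = bytes([10, 20, 30, 41] + [10, 20, 30, 40] * 3)  # one byte changed

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tweaked).hexdigest()
print(h1 == h2)  # False: exact hashes match nothing but exact copies

# Even character-by-character, the two digests share almost nothing.
agree = sum(a == b for a, b in zip(h1, h2))
print(f"{agree}/64 hex digits agree")  # only a handful, by chance
```

This is why similarity-based perceptual matching is used instead, and why the open question for Facebook's test is how much adversarial distortion its matcher tolerates before a known image slips through.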

Source: The Verge