The end of cyber-flashing? Instagram is developing a tool to protect users from receiving unsolicited nude photos in their direct messages, Meta confirms
- Instagram is creating a feature that detects nude images sent via direct message
- It automatically covers the photo, but gives the recipient the option to open it
- A Meta spokesperson said that the company will not be able to view the images
- It is hoped the tool will reduce the number of ‘cyber-flashing’ cases
Instagram is developing a tool that can block unsolicited nude photos sent in a direct message (DM), a spokesperson for its parent company Meta has confirmed.
Known as ‘Nudity Protection’, the feature will reportedly work by detecting a nude image and covering it, before giving the user the option of whether to open it or not.
More details are due to be released in the coming weeks, but Instagram says it will not be able to view the actual images or share them with third parties.
This has been confirmed by Liz Fernandez, Meta’s Product Communication Manager, who said it will help users ‘shield themselves from nude photos as well as other unwanted messages’.
She told The Verge: ‘We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive.’
News of the feature was first announced on Twitter by leaker and mobile developer Alessandro Paluzzi.
He said that ‘Instagram is working on nudity protection for chats’, and posted a screenshot of what users may see when opening the feature.
It said: ‘Securely detect & cover nudity. Technology on your device covers photos that may contain nudity in chats. Instagram can’t access the photos.
‘Choose to view photos or not. Photos will stay covered unless you choose to view them.
‘Get safety tips. Learn ways to stay safe if you’re interacting with sensitive photos.
‘Turn on or off anytime. Update in your Settings.’
Ms Fernandez likened the feature to the ‘Hidden Words’ feature on Instagram that was introduced last year.
This allows users to automatically filter messages containing words, phrases and emojis they don’t want to see.
She also confirmed that Nudity Protection will be a voluntary feature that users can turn on and off as they please.
It is still in the early stages of development, but will hopefully help to reduce incidents of ‘cyber-flashing’.
Cyber-flashing is when a person is sent an unsolicited sexual image on their mobile device by an unknown person nearby.
This could be through social media, messages or other sharing functions such as Airdrop or Bluetooth.
HOW WILL THE ‘NUDITY PROTECTION’ TOOL WORK?
The new ‘Nudity Protection’ tool will reportedly work by detecting images that may contain nudity that have been sent to the user over chat.
It will automatically cover the image, and the user can choose whether to view it or not when they open the message.
Instagram will not be able to access the photos, and users can turn the feature on or off at any time.
In March, it was announced that men who send unsolicited ‘d**k pics’ will soon face up to two years in jail.
Ministers confirmed that laws banning this behaviour will be included in the Government’s Online Safety Bill, which is set to be passed in early 2023.
The move will apply to England and Wales – as cyber-flashing has been illegal in Scotland since 2010.
It came after a study from the UCL Institute of Education found that non-consensual image-sharing practices were ‘particularly pervasive, and consequently normalised and accepted’.
Researchers quizzed 144 boys and girls aged from 12 to 18 in focus groups, and a further 336 in a survey about digital image-sharing.
Thirty-seven per cent of the 122 girls surveyed had received an unwanted sexual picture or video online.
A shocking 75 per cent of the girls in the focus groups had also been sent an explicit photo of male genitals, with the majority of these ‘not asked for’.
Snapchat was the most common platform used for image-based sexual harassment, according to the survey findings.
But reporting on Snapchat was deemed ‘useless’ by young people because the images automatically delete.
Furthermore, research by YouGov found that four in ten millennial women have been sent a picture of a man’s genitals without consent.
Men who send unsolicited ‘d*** pics’ may be NARCISSISTS and usually expect to receive ‘something in return’
Men who send other people unsolicited images of their genitals are likely to be more narcissistic and sexist than those who do not, psychologists have found.
Researchers at Pennsylvania State University surveyed over a thousand men to compare the personalities and motivations of those who sent intimate images and those who did not.
Rather than doing so for personal gratification, men who share images of their genitals typically do so in the hope of arousing the recipient and receiving images back in return.
A small minority of participants reported sending the private photos in order to intentionally elicit a negative response from women.
The researchers conclude that the practice can neither be construed as solely sexist nor as a positive sexual outlet.