iOS 17 expands protection against unsolicited pictures to all iPhone users – Times of India
Apple's Communication Safety feature, designed to protect children from explicit images in iMessage, will now be extended to adult users and to other forms of communication, including videos.
What is the Communication Safety feature?
The Communication Safety feature in Messages uses on-device machine learning to automatically blur unsolicited explicit images in iMessage, and the same processing will now be applied to videos as well. Apple says that all image and video analysis is performed on the device, so that not even the company can access the content.
As of now, the Communication Safety tool is available only for minors, as an opt-in feature within Apple's Family Sharing system. When enabled, it can identify images that may contain nudity as a child sends or receives them. If such an image is detected, the child is notified and the image is blurred before it can be viewed on their device. The child is also shown helpful resources and given the option to message a trusted adult for assistance.
With iOS 17, Communication Safety will help protect children from viewing or sharing photos containing nudity through AirDrop, the new Contact Posters, FaceTime messages, and when browsing their photo library via the Photo Picker. The feature also extends to video content, where it can likewise detect nudity.
A separate "Sensitive Content Warning" extends this protection to all users, adults and minors alike, against receiving unsolicited pictures.
If an image or video contains nudity, a pop-up warning will appear regardless of the user's age. "Naked photos and videos show the private body parts that are usually covered by underwear or bathing suits," the pop-up explains. "It's not your fault, but naked photos and videos can be used to hurt you." The message then asks whether the user wants to view the content, while also offering reassurance and guidance on staying safe.