Apple to use new tech to see what photos you have on your iPhone – Times of India

Apple will reportedly deploy a new technology to see what kind of photos you have in your iPhone's gallery and identify whether or not you are storing child sexual abuse content (CSAM). Apple will use hashing algorithms to check the photos stored on users' iPhones, with photo-identification software on the backend to recognise whether an image looks like child pornography or any other kind of abuse.
Apple will reportedly roll out a “client-side tool for CSAM scanning”. This means your iPhone will automatically download a database of hashes and check every photo saved on the device against it to identify illegal content. If the algorithm spots objectionable content, and there are too many such photos, the iPhone will automatically report the matches to Apple's servers.
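As a rough illustration of the reported design, the sketch below shows client-side hash matching with a reporting threshold in Swift. It is a simplification: it uses a standard cryptographic hash (SHA-256), whereas Apple is reported to use a perceptual hash that survives resizing and re-encoding, and the hash values and threshold here are invented for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical set of hex-encoded digests distributed to the device (placeholder values).
let knownBadHashes: Set<String> = [
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26"
]
let reportingThreshold = 10  // invented value; the device reports only above this

// Hex-encode the SHA-256 digest of a photo's bytes.
func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Count how many of the given photos match the known-hash database.
func countMatches(in photoFiles: [URL]) -> Int {
    photoFiles.reduce(0) { count, url in
        guard let data = try? Data(contentsOf: url) else { return count }
        return count + (knownBadHashes.contains(sha256Hex(of: data)) ? 1 : 0)
    }
}

let matches = countMatches(in: [])  // in reality: URLs of the device's photos
if matches > reportingThreshold {
    // In the reported design, only at this point would anything be sent to Apple's servers.
    print("Threshold exceeded: \(matches) matching photos")
}
```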

The problem is that hashing algorithms are not always accurate and may produce false positives. And while Apple says it will detect only illegal child abuse content, the same mechanism could be used to flag other types of content as well. According to a report by 9to5Mac, if “Apple allows governments to control the fingerprint content database, then perhaps they could use the system to suppress political activism.”
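To see why false positives are a concern: perceptual hashes are compared by similarity rather than exact equality, typically by counting how many bits differ (the Hamming distance) and accepting anything under a threshold. The hypothetical sketch below uses made-up hash values and a made-up threshold.

```swift
// Hypothetical perceptual hashes of a user's photo and a database entry
// (the values and the threshold are made up for illustration).
let photoHash: UInt64    = 0xA5F3_90C2_17BD_4E68
let databaseHash: UInt64 = 0xA5F3_90C2_17BD_4E60

// Hamming distance: how many bits differ between the two hashes.
let hammingDistance = (photoHash ^ databaseHash).nonzeroBitCount

// A perceptual-hash matcher treats "close enough" as a match.
let matchThreshold = 5
if hammingDistance <= matchThreshold {
    // An unrelated photo whose hash happens to land this close would also be
    // flagged, which is exactly the false-positive risk described above.
    print("Match within threshold (distance \(hammingDistance))")
}
```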
Even though Apple stores iCloud content in encrypted form, the problem is that Apple also holds the keys needed to decrypt it. This means that if Apple is compelled by a law enforcement agency, it could allow a government to look through all the photos of a particular user.
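The distinction here is about who holds the decryption key. A hedged sketch using CryptoKit (the data and key handling are placeholders, not Apple's actual iCloud design): with server-side encryption the provider keeps the key and can decrypt on demand, whereas with end-to-end encryption only the user's devices ever hold it.

```swift
import Foundation
import CryptoKit

// Illustration of the "who holds the key" point, not Apple's actual iCloud design.
let photo = Data("example photo bytes".utf8)

// Server-side encryption: the provider generates and stores this key,
// so it (or anyone who can compel it) can decrypt the stored data later.
let providerKey = SymmetricKey(size: .bits256)
let sealedBox = try! AES.GCM.seal(photo, using: providerKey)
let decryptedByProvider = try! AES.GCM.open(sealedBox, using: providerKey)
assert(decryptedByProvider == photo)

// End-to-end encryption would mean this key never leaves the user's devices,
// leaving the provider with ciphertext it cannot open on anyone's behalf.
```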
“The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t ‘hurt’ anyone’s privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal,” said cryptography and security expert Matthew Green.
Governments across the world have been asking for technologies to break end-to-end encrypted (E2E) communications, as law enforcement agencies have a tough time decrypting them. The only hope here is that Apple will not let its systems be misused. “But even if you believe Apple won’t allow these tools to be misused there’s still a lot to be concerned about. These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review,” added Green.
