NEW IPHONE FEATURE WILL PREVENT CHILD ABUSE
Apple continues to develop new features to ensure the security and privacy of its users. According to reports that emerged today, Apple will introduce a feature intended to prevent child abuse with the iOS 18.2 update. The feature is currently being tested in Australia as part of the iOS 18.2 beta. So, what are the details? Let's take a look at the news together.
iOS 18.2 update will prevent child abuse
Apple often says it cares about the privacy and security of its users, but the company has recently been criticized for allegedly not taking Child Sexual Abuse Material (CSAM) seriously enough within its services. In response, Apple has begun testing a feature in the iOS 18.2 update that aims to block child exploitation.
The new feature will automatically detect photos and videos containing nudity in the iMessage, AirDrop, FaceTime, and Photos apps for iPhone users under the age of 13. When the iPhone detects such images, a window will pop up allowing the user to report them.
Through this window, users can send a report to Apple. When Apple receives the notification, it will forward the necessary information to the authorities. When the alert appears, the user's device will prepare a report that includes the offending material, the messages sent immediately before and after it, and the contact information for both accounts. Users will also be given the option to fill out a form explaining what happened.
After receiving the report, the tech giant will review the content and take action. The company can block the sending of messages via iMessage and report content to the authorities where it deems necessary.
The feature is currently being tested in Australia on the iOS 18.2 beta and is expected to roll out once iOS 18.2 becomes available to everyone.