Apple has improved its ability to protect children from explicit photos. The new feature was spotted in the beta version of iOS 18.2 in Australia, The Guardian reports.
Young users who receive intimate photos will now be able to report them directly to Apple. The company will review the report and contact the police if a violation is confirmed.
The protection works in built-in iPhone apps, including iMessage, AirDrop, FaceTime, Contacts, and Phone, as well as some third-party messaging apps. To enable it, go to “Settings” – “Screen Time” – “Communication Safety”.
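For third-party apps, the documented route to this kind of on-device detection is Apple’s SensitiveContentAnalysis framework, available since iOS 17. Below is a minimal Swift sketch of how a messaging app might check an incoming image; it illustrates the public API only and is an assumption for illustration, not Apple’s internal implementation of the Messages feature (apps also need the Sensitive Content Analysis entitlement to use it).

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: decide whether a third-party messenger should blur an
// incoming image, using Apple's on-device SensitiveContentAnalysis API.
func shouldBlurIncomingImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The policy mirrors the user's (or parent's) Screen Time settings;
    // if Communication Safety / Sensitive Content Warnings are off,
    // analysis is disabled and the app should not intervene.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on-device; the image is not uploaded.
        let result = try await analyzer.analyzeImage(at: url)
        return result.isSensitive
    } catch {
        // If analysis fails, fall back to showing the image unblurred.
        return false
    }
}
```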
With the release of iOS 18.2, users will be able to fill out a report form describing what happened. The report includes contact information for the sender and recipient to help identify the offender. If a violation is confirmed, Apple can block the offender from sending iMessages and, as a last resort, forward the information to law enforcement.
The new system is currently running in the iOS 18.2 beta in Australia, where authorities have ordered Apple and other companies to police content related to child abuse and terrorism. The feature is expected to reach other regions in the future.
The ability to protect children from intimate photos first appeared on the iPhone with the release of iOS 17: if a user under 13 receives a nude photo, the image is automatically blurred, the parent receives a notification, and the parent’s password is required to view the photo.
It was previously reported that Apple is developing an app to combat diabetes.
Source: Gazeta

Jackson Ruhl is a tech and sci-fi expert who writes for “Social Bites”. He brings his readers the latest news and developments from the world of technology and science fiction.