Apple has rolled out a raft of changes in its iOS 18.2 software update, including a new safety feature in Australia that allows children to report nude images and videos sent to them. The feature automatically detects images and videos that may contain nudity in iMessage, AirDrop, FaceTime and Photos, as well as in third-party apps that adopt Apple’s Communication Safety framework.

If nudity is detected through any of these channels, the image or video is blurred and the child is offered resources to help them handle the situation. The automatic detection is made possible by on-device machine learning. Alongside the blurred image, Apple displays a warning asking whether the user wants to report it, together with other support options.

According to The Guardian, the report is sent to Apple, which can then refer the material to police. The device compiles the report to include the sensitive images or videos, the messages sent immediately before and after the media, and the contact information for both accounts. Users can also fill out a form describing what happened.