Apple's iOS 26 beta contains a previously undisclosed FaceTime feature that automatically freezes video calls when nudity is detected, according to developers testing the software released last month. The feature, discovered by X user @iDeviceHelp, pauses both audio and video streams when the app's on-device machine learning identifies someone undressing during a call.
The discovery raises questions about the scope of Apple's content moderation tools and whether safety features designed for children should extend to adult users without explicit consent.
When FaceTime detects nudity, users see a warning message: "Audio and video are paused because you may be showing something sensitive. If you feel uncomfortable, you should end the call."[1][2] Users can then choose to resume the call or end it entirely.
The feature appears to apply to all account types in the current beta, not just the child accounts it was designed for, according to reports from 9to5Mac[3]. Apple built the tool as part of its Communication Safety suite, announcing at its June developer conference that the software would "intervene when nudity is detected in FaceTime video calls"[2].
The feature is disabled by default and must be manually enabled in FaceTime settings under "Sensitive Content Warning," according to Engadget[1]. However, multiple tech publications reported that the feature triggered for adult users during testing[2][4].
Apple maintains that all nudity detection occurs locally on users' devices, with no data transmitted to the company's servers. "Apple doesn't receive an indication that nudity was detected and doesn't get access to the photos or videos as a result," the company states on its Communication Safety support page[1].
The on-device processing aims to address privacy concerns, but questions remain about why the feature affects adult accounts when it was designed specifically for child safety, according to Beebom[2]. Some developers speculate this could be a bug in the early beta software[3].
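Apple has not published FaceTime's detection pipeline, but its public SensitiveContentAnalysis framework (iOS 17 and later) exposes the same on-device model that powers Sensitive Content Warning. The sketch below shows how a video app might gate frames on that API; the helper name shouldPauseStream and the per-frame CGImage capture are assumptions for illustration, and FaceTime's actual internal implementation may differ.

```swift
import CoreGraphics
import SensitiveContentAnalysis

// Sketch only: FaceTime's internal pipeline is not public. This uses Apple's
// documented SensitiveContentAnalysis framework (iOS 17+), which runs the
// same on-device model behind Sensitive Content Warning. Calling it requires
// the com.apple.developer.sensitivecontentanalysis.client entitlement.
// `shouldPauseStream` is a hypothetical helper name, not an Apple API.
func shouldPauseStream(for frame: CGImage) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer is inert until the user opts in via Settings, matching
    // the "disabled by default" behavior described above.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on-device; the frame never leaves the phone.
        let analysis = try await analyzer.analyzeImage(frame)
        return analysis.isSensitive
    } catch {
        // Fail open: an analysis error should not interrupt the call.
        return false
    }
}
```

Because both the opt-in policy check and the classification run locally, this pattern is consistent with Apple's statement that no indication of a detection reaches its servers.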
The FaceTime feature extends Apple's existing Communication Safety tools, which previously focused on detecting nude images in Messages and Photos. The company expanded these protections following pressure from child safety advocates and lawmakers concerned about online exploitation.
Features in developer betas often change before public release, and Apple has not confirmed whether the nudity detection will remain active for adult accounts in the final iOS 26 version expected this fall[1].