Apple Will Now Monitor iPhone and iCloud Pictures for Child Abuse

The features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
By Emily Rella
Apple Inc. announced Thursday that it's implementing a new system to check iPhone images before they're uploaded to the iCloud storage service, to ensure they don't match known images of child sexual abuse.
During a press conference, Apple explained that the service will turn device images into an unreadable series of hashes, which will be matched against a database of hashes from the National Center for Missing and Exploited Children.
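To make the matching idea concrete: Apple's actual system relies on a proprietary perceptual hash (NeuralHash) combined with privacy-preserving cryptography, neither of which is public. The sketch below is a simplified illustration only, using an ordinary cryptographic hash and hypothetical image data to show how a device can compare hashes against a known database without the images themselves being readable from those hashes.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 is used here purely to
    # illustrate hash-based lookup, not what Apple actually deploys.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known abusive images
# (in Apple's system, these would derive from NCMEC's data).
known_hashes = {image_hash(b"example-flagged-image")}

def should_flag(image_bytes: bytes) -> bool:
    # Compare the image's hash against the database; the hash alone
    # reveals nothing about the image's content.
    return image_hash(image_bytes) in known_hashes

print(should_flag(b"example-flagged-image"))  # matches the database
print(should_flag(b"ordinary-photo"))         # no match
```

Note that a real deployment would use a perceptual hash so that minor edits to an image (resizing, recompression) still produce a match; an exact cryptographic hash like the one above changes completely if even one byte differs.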
According to a note on Apple's website, this is just one part of the brand's new child safety initiative: "First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple."
The Siri and Search functions in iOS will also play a part in combating child abuse: they'll provide parents and children with information and help in unsafe situations, and intervene when users attempt to search for abuse-related topics.
Apple, which markets itself as a secure and private option for consumers, was careful to highlight that these steps are not meant to infringe on privacy.
The features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.