Press "Enter" to skip to content

Apple Built a Backdoor with the New Child Abuse Detection Tool

Apple's plans to implement new features on its platforms to detect Child Sexual Abuse Material (CSAM) have sparked a lot of debate. The company is attempting to pioneer a solution to a problem that has bedevilled law enforcement officials and technology corporations alike in recent years: the large-scale, ongoing proliferation of CSAM on major internet platforms.

As recently as 2018, internet companies reported the presence of up to 45 million photographs and videos that might be considered child sex abuse material, a staggering figure. While the crisis is real, opponents worry that Apple's new features, which include algorithmic scanning of users' devices and messages, are a breach of privacy and, more worryingly, might be repurposed in the future to search for material other than CSAM.

This move could lead to new forms of pervasive surveillance as well as a potential workaround for encrypted communications, which is one of privacy's last, best hopes. We need to take a quick look at the specifics of the proposed changes to appreciate these concerns. To begin, the company will release a new tool that scans images uploaded to iCloud from Apple devices for evidence of child sex abuse content.

According to a technical paper, the new feature uses a neural matching function called NeuralHash to assess whether photographs on a user's iPhone match known hashes, or unique digital fingerprints, of CSAM. Images shared with iCloud are compared against a vast database of known CSAM pictures compiled by the National Center for Missing and Exploited Children (NCMEC). If a sufficient number of matching photographs is found, they are flagged for inspection by human reviewers, who then notify NCMEC.

Some people have voiced anxiety that their phones may contain photos of their own children in the bathtub, running nude through a sprinkler, or something similar. According to Apple, you don't have to worry about that.
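To make the threshold idea concrete, here is a minimal sketch in Swift of matching photo hashes against a set of known hashes and flagging an account only once a match count is reached. The names (HashMatcher, knownHashes, matchThreshold) and the plain set lookup are illustrative assumptions; Apple's actual system relies on NeuralHash and cryptographic matching rather than a simple string comparison.

```swift
import Foundation

// Illustrative sketch only: flag an account when the number of photos whose
// hashes appear in a known-CSAM hash database meets a threshold.
// This is NOT Apple's implementation; names and logic are assumptions.
struct HashMatcher {
    let knownHashes: Set<String>   // stand-in for the NCMEC-derived hash database
    let matchThreshold: Int        // matches required before anything is reported

    // Counts how many photo hashes appear in the known set and
    // returns true only when the count reaches the threshold.
    func shouldFlag(photoHashes: [String]) -> Bool {
        let matches = photoHashes.filter { knownHashes.contains($0) }.count
        return matches >= matchThreshold
    }
}

// Example usage with made-up hash strings.
let matcher = HashMatcher(knownHashes: ["a1b2", "c3d4"], matchThreshold: 2)
let flagged = matcher.shouldFlag(photoHashes: ["a1b2", "ffff", "c3d4"])
print(flagged)  // true: two known hashes matched, meeting the threshold
```

The threshold is the key design point the paper emphasizes: no single match triggers a report, which is meant to reduce false positives from ordinary family photos.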
