Why Apple's Image Scanning Technology Isn't Private at All

Apple recently introduced new technology to detect child sexual abuse material (CSAM), but it has received more criticism than praise from the privacy community.

Why Apple Wants to Scan Your Photos

While Apple has previously been hailed as one of the few Big Tech companies that actually cares about user privacy, the CSAM scanning technology it introduced last week throws a spanner in the works. Experts say that despite Apple's privacy promises, the technology ultimately puts all Apple users at risk.

“Apple is on a dangerous slippery slope; they have created a tool that is vulnerable to government backdoors and malicious abuse,” Farah Sattar, founder and security researcher at DCRYPTD, said in an email interview with Lifewire.

The new technology works in two ways. First, an image is scanned on your device before it's uploaded to iCloud: the image's hash is compared against a database of hashes of known CSAM, and if there's a match, Apple receives cryptographic voucher data. The second part uses on-device machine learning to identify and blur sexually explicit images that children receive through Messages.
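To make that first step concrete, here is a minimal, heavily simplified Swift sketch of client-side hash matching. All names here (scanBeforeUpload, makeSafetyVoucher) are hypothetical, and SHA-256 merely stands in for Apple's perceptual NeuralHash, which is designed to survive resizing and re-encoding. Apple's published design also layers private set intersection and threshold secret sharing on top, so that individual matches are not directly visible to Apple.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a perceptual hash. Apple's real system uses a
// neural-network-based hash ("NeuralHash"); here we model a hash as raw bytes.
typealias PerceptualHash = Data

// A database of hashes of known CSAM, as would be supplied by child-safety
// organizations. The contents here are placeholders.
let knownHashes: Set<PerceptualHash> = []

// Placeholder hash function: a real perceptual hash is robust to minor image
// edits; SHA-256 is used here only to keep the sketch self-contained.
func computePerceptualHash(of data: Data) -> PerceptualHash {
    Data(SHA256.hash(data: data))
}

// Stand-in for the voucher construction. In Apple's published design, the
// voucher encrypts match metadata and can only be opened by Apple once a
// threshold number of matches has accumulated for an account.
func makeSafetyVoucher(for imageData: Data, hash: PerceptualHash) -> Data {
    hash
}

// Sketch of the on-device matching step: hash the image and check it against
// the known-hash set before upload. A non-match reports nothing.
func scanBeforeUpload(imageData: Data) -> Data? {
    let hash = computePerceptualHash(of: imageData)
    guard knownHashes.contains(hash) else {
        return nil // no match: no voucher is produced
    }
    return makeSafetyVoucher(for: imageData, hash: hash)
}

// Example usage: with an empty hash database, no voucher is produced.
let voucher = scanBeforeUpload(imageData: Data("example image bytes".utf8))
assert(voucher == nil)
```

The key design point critics object to is visible even in this sketch: the matching happens on the user's own device, so whoever controls the hash database controls what the device reports on.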
