Should Apple be able to scan devices for child sex abuse images?

Apple has announced details of a system that could find child sexual abuse material (CSAM) on US customers' devices.

Before an image is stored in iCloud Photos, the technology will check it for matches against already known CSAM.

Apple said that if a match is found, a human reviewer will then assess the image and report the user to legal authorities.
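To make the flow described above concrete, here is a minimal, purely illustrative sketch of on-device matching against a set of known-image hashes before upload, with matches flagged for human review. It assumes a simple exact-hash comparison and hypothetical helper functions (loadKnownHashDatabase, flagForHumanReview, uploadToCloud); Apple's announced design is considerably more elaborate (perceptual hashing and encrypted safety vouchers) and is not reproduced here.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known-image hashes; in practice such a list
// would ship with the operating system. Placeholder only.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownImageHashes = loadKnownHashDatabase()

// Returns true if the photo's hash appears in the known-hash set.
// A real system would use perceptual hashing so that resized or
// re-encoded copies still match; SHA-256 is used here only to keep
// the sketch self-contained.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}

// Hypothetical upload path: a matching photo is flagged for human
// review rather than being reported automatically.
func prepareForUpload(_ imageData: Data) {
    if matchesKnownImage(imageData) {
        flagForHumanReview(imageData)
    } else {
        uploadToCloud(imageData)
    }
}

func flagForHumanReview(_ imageData: Data) { /* hypothetical hook */ }
func uploadToCloud(_ imageData: Data) { /* hypothetical hook */ }
```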

However, there are privacy concerns that the technology could be expanded to scan phones for other prohibited content, or even political speech.

Experts worry that the technology could be used by authoritarian governments to spy on their citizens.
Sharon Bradford Franklin is co-director of the Security & Surveillance Project at the Washington-based Center for Democracy and Technology, which opposes the changes proposed by Apple.

(A woman types on an Apple laptop. Credit: PA)

Duration: 4 minutes