Should Apple be able to scan devices for child sex abuse images?
Apple has announced details of a system that could find child sexual abuse material (CSAM) on US customers' devices.
Before an image is stored in iCloud Photos, the technology will search for matches against already known CSAM.
Apple said that if a match is found, a human reviewer will assess it and, if confirmed, report the user to law enforcement.
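At a high level, the matching step compares a fingerprint of each photo against a database of fingerprints of already known images. The sketch below is a simplified illustration of that idea in Python using an exact cryptographic hash; Apple's reported design instead uses a perceptual hash ("NeuralHash") and cryptographic matching so the device does not learn the database contents. All names and digest values here are hypothetical, not Apple's implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests for already known images.
# A real deployment would use perceptual hashes supplied by child-safety
# organisations, not plain SHA-256 values hard-coded like this.
KNOWN_IMAGE_DIGESTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def digest(image_path: Path) -> str:
    """Return a SHA-256 digest of the raw image bytes."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def flag_before_upload(image_path: Path) -> bool:
    """Check an image against the known-digest set before upload.

    A match does not trigger automatic reporting here; in the system as
    described, a human reviewer would assess flagged matches first.
    """
    return digest(image_path) in KNOWN_IMAGE_DIGESTS
```

Note that an exact hash only matches byte-identical files; the privacy debate centres on perceptual matching, which can recognise resized or re-encoded copies and is therefore harder to audit.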
However, there are privacy concerns that the technology could be expanded to scan phones for other prohibited content, or even political speech.
Experts worry that the technology could be used by authoritarian governments to spy on their citizens.
Sharon Bradford Franklin is co-director of the Security & Surveillance Project at the Washington-based Center for Democracy and Technology, which opposes the changes proposed by Apple.
(A woman types on an Apple laptop. Credit: PA)