The European Union’s home affairs commissioner, Ylva Johansson, has confirmed the Commission is investigating whether it broke recently updated digital governance rules when her department ran ...
Apple's much-lauded privacy efforts hit a sour note a few days ago when it announced a new feature intended to protect children by reporting illegal content stored in a user's iCloud ...
Last week, Apple announced three new features that target child safety on its devices. While the intentions are good, the new features have drawn scrutiny, with some organizations and Big Tech ...
After a TikTok accusing Patreon of ignoring reports and knowingly profiting from accounts posting child sexual abuse materials (CSAM) attracted hundreds of thousands of views, more TikTokers piled on, ...
The European Commission has again been urged to more fully disclose its dealings with private technology companies and other stakeholders in relation to a controversial piece of tech policy that ...
Apple is abandoning its plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material (CSAM) following backlash from critics who decried the ...
A pair of Princeton researchers claim that Apple's CSAM detection system is dangerous, citing their own work exploring and warning against similar technology, but the two systems are far from identical. Jonathan ...
A user on Reddit says they have discovered a version of Apple's NeuralHash algorithm, used in CSAM detection, in iOS 14.3. Apple says the extracted version is not current and won't be ...
Apple is today announcing a trio of new efforts it’s undertaking to bring new protection for children to iPhone, iPad, and Mac. This includes new communications safety features in Messages, enhanced ...