The European Parliament and the Council are discussing a regulation that would require all messages to be scanned for child pornography, and which is perceived by many as a threat to basic freedoms.
Hundreds of academics and engineers, non-profit organizations such as Reporters Without Borders, and the Council of Europe believe that the Child Sexual Abuse Regulation (CSAR) would mean sacrificing confidentiality on the internet, and that this price is unaffordable for democracies.
The European Data Protection Supervisor, who is preparing a statement on this for late October, has said that it could become the basis for the de facto widespread and indiscriminate scanning of all EU communications. The proposed regulation, often referred to by critics as Chat Control, holds companies that provide communication services responsible for ensuring that unlawful material does not circulate online. If, after undergoing a risk assessment, it is determined that they are a channel for pedophiles, these services will have to implement automatic screening.
The mastermind behind the billboards and newspaper exhortations calling on Apple to detect pedophile material on iCloud is, reportedly, a non-profit organization called Heat Initiative, which is part of a crusade against the encryption of communications known in the U.S. as the Crypto Wars. This movement has shifted its rationale from fighting terrorism to combating the spread of online child pornography in its push to end encrypted messaging, the last great pocket of privacy left on the internet. “It is significant that the U.S., the European Union and the United Kingdom are simultaneously processing regulations that, in practice, will curtail encrypted communications. It seems like a coordinated effort,” says Diego Naranjo, head of public policies at the digital rights non-profit EDRi.
Sadly, there is no perfect decision; I would sacrifice privacy to protect children. But there should always be multiple options, not just one. I really hope some genius can find a technical solution where people's privacy is protected and we can also fight atrocities like child abuse.
This problem is not solved by a technical solution, a backdoor.
It is solved by adding manual reporting functionality to communication systems (see the sketch below) and then teaching children to report any such atrocities.
If they cannot decide for themselves, that only means one of two things: they don't know what to do with it, or they don't trust their caretakers (parents, school personnel and similar parties); those are the actual problems to be solved.
Obviously, children should not have unsupervised access to internet-connected devices until they are capable of deciding for themselves.
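For what it's worth, here is a minimal sketch, in TypeScript, of what such a user-initiated report could look like in a messaging client. The endpoint URL, payload shape, and field names are made up for illustration; the point is simply that only the content the user chooses to report ever leaves the device, so the provider never has to scan everything.

```typescript
// Hypothetical user-initiated report flow for a messaging client.
// The endpoint and payload shape are illustrative assumptions, not a real API.

interface AbuseReport {
  messageId: string;        // identifier of the offending message
  reportedUserId: string;   // the account being reported
  reason: "csam" | "grooming" | "harassment" | "other";
  comment?: string;         // optional free-text note from the reporter
  // The reporter explicitly attaches the content they want reviewed;
  // nothing else leaves the device, so no blanket scanning is required.
  attachedContent: string;
}

async function reportMessage(report: AbuseReport, authToken: string): Promise<void> {
  const response = await fetch("https://example-messenger.invalid/api/v1/reports", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${authToken}`,
    },
    body: JSON.stringify(report),
  });
  if (!response.ok) {
    throw new Error(`Report failed with status ${response.status}`);
  }
}

// Example: wired to a "Report" button in the chat UI.
reportMessage(
  {
    messageId: "msg-12345",
    reportedUserId: "user-6789",
    reason: "grooming",
    attachedContent: "(text of the reported message, as selected by the user)",
  },
  "example-session-token",
).catch(console.error);
```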
Are you the kind of parent who shoves YouTube into their kid's face just to calm them down?
At the appropriate age, you start teaching them how to use these devices appropriately, and also to let you know if they see something wrong; when that happens, you teach them what to do in that case.
A random person has sent a message invitation? If neither you nor your children know them, you tell them to decline it.
Someone your kids know has sent them something inappropriate? You tell your kids that it is inappropriate, what to do with it (use the platform's report function, block the person, etc.), and to report to you if it happens again.
In many cases it is even better to talk about some of the dangers before anything bad happens.
For example: if a website asks for cookie consent, they should reject whatever they can, and if a website wants them to register, they should ask you about it first.
Obviously, they won't know this by themselves until you teach them. As a parent, this is your job; the school won't do it for you.
All of this should be the most basic stuff for every responsible parent.