Many types of surveillance exist on devices ranging from smartphones to IoT hardware, but few are as ubiquitous and intrusive as Client-Side Scanning (CSS) for Child Sexual Abuse Material Detection (CSAMD). Apple proposed scanning its software and hardware for such imagery. While the CSAMD proposal has since been shelved, the European Union has proposed a new regulation mandating CSS to combat and prevent child sexual abuse, which would deliberately weaken encryption on all messaging services. CSS amounts to mass surveillance of personal property, in this case pictures and text, and was proposed by Apple without proper consideration of the privacy, cybersecurity and legal consequences. We first argue why CSS should be limited or not used at all, and briefly discuss some clear issues with how pictures are cryptographically handled and with CSAMD's claims to preserve privacy. In the second part, we analyse the possible human rights violations which CSS in general can cause under the European Convention on Human Rights. The focus is the harm which the system may cause to individuals, and we also comment on the proposed European Union Regulation. We find that CSS is problematic in itself because such systems can rarely fulfil the purposes for which they are built; this comes down to the fact that software is never "perfect", as antivirus software illustrates. Moreover, the costs of attempting to solve problems such as CSAM in this way far outweigh the benefits, and this is unlikely to change regardless of how the technology develops. We further find that CSAMD as proposed is unlikely to preserve privacy or security in the way described in Apple's own materials. We also find that the CSAMD system, and CSS in general, would likely violate the Right to a Fair Trial, the Right to Privacy and the Freedom of Expression.
This is for three reasons: the pictures could have been obtained in a way that renders a trial against a legitimate perpetrator inadmissible or violates their right to a fair trial; the lack of any safeguards protecting privacy at the national legal level would violate the Right to Privacy; and it is unclear whether the kind of scanning done here could pass the legal test which the Freedom of Expression requires, making a violation of that right likely as well. Finally, we find significant issues with the proposed Child Abuse Regulation: it relies on techno-solutionist arguments without substance, disregards conventional knowledge on cybersecurity, and does not justify the independence and power of a "centre" meant to help solve the problem.