Somewhat off-topic, but this subject led me to a disappointing realization:
The surest way for someone to obtain child porn images that won't trigger a hit in these known child-abuse-image databases would be to take new photos.
Is there a reason we can be confident that these databases aren't creating a market for novel child abuse at a greater rate than they're taking abusers out of the community? Especially since, presumably, abusers are only a (small?) subset of the people who view these images?
The goal of NCMEC's database is to stop the distribution, and hopefully the encouragement, of CSAM. Besides the harm that continued distribution does to victims' mental state, wider circulation means more people already on barely-legal parts of the internet (e.g. QAnon message boards) may learn how to obtain CSAM and join the market of buyers for such material. Stop that, and the population of people looking for CSAM shrinks, which makes it less economical for criminal enterprises to target children specifically to produce the material.
> The goal of NCMEC's database is to stop the distribution, and hopefully the encouragement, of CSAM.
But potentially with the unintended consequence of encouraging the creation of more child abuse material, in order to evade the detection of recognized images.
See this other HN thread - https://news.ycombinator.com/item?id=28091750 - the hash is a 'perceptual hash' rather than a bytestream hash precisely because they know a lot can and will be done to try to evade PhotoDNA and NeuralHash.
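To make that distinction concrete, here's a minimal sketch of the idea. This is a toy "average hash" (aHash), not PhotoDNA or NeuralHash (which are proprietary and far more sophisticated), and the function names and the 8x8 size are just illustrative assumptions. The point is that re-encoding or mildly editing an image changes every byte (so an exact SHA-256 digest is useless for matching) but usually flips only a few bits of a perceptual hash, so matching is "Hamming distance below a threshold" rather than equality:

    # Toy contrast: bytestream hash vs. a simple perceptual hash.
    # NOT PhotoDNA/NeuralHash -- just an illustrative average-hash sketch.
    import hashlib
    from PIL import Image  # assumes Pillow is installed

    def bytestream_hash(path: str) -> str:
        # Exact hash of the file bytes: any re-encode or single-byte edit
        # produces a completely unrelated digest.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def average_hash(path: str, size: int = 8) -> int:
        # Shrink to a tiny grayscale grid and threshold each pixel at the mean.
        # Mild resizing, recompression, or color tweaks leave most bits intact.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p >= mean else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        # Number of differing bits; "a match" is distance under some threshold.
        return bin(a ^ b).count("1")

Real systems obviously use far more robust features than a 64-bit average hash, but the structure is the same: fuzzy matching against a database of known hashes, tolerant of the easy evasion tricks, which is exactly why brand-new images are the one thing it can't catch.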