NCMEC says that reports of child porn are growing, but that growth could easily be in reports per posting, postings per image, or images per activity. NCMEC just *counts* reports, which are either a member of the public clicking a “report” button or an algorithm finding suspicious content. They acknowledge that a significant part of the rise is from broader deployment of such algorithms.
Good point. I wonder:
Did algorithm deployment expand a lot from 2014 to 2018? (I’m particularly boggled by the 18x increase in reports between 2014 and 2018)
What amount of the increase seems reasonable to explain away by changes in reporting methods?
About half? (i.e. remaining 2014-to-2018 increase to be explained is 9x?)
75%? (i.e. remaining 2014-to-2018 increase to be explained is 4x?)
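The arithmetic behind those parentheticals can be made explicit. A minimal sketch, assuming (as the figures above seem to) that the explained share is a simple multiplicative discount on the overall 18x growth:

```python
def remaining_factor(total_growth: float, fraction_explained: float) -> float:
    """Growth factor left to explain after discounting the share
    attributed to changes in reporting methods."""
    return total_growth * (1 - fraction_explained)

print(remaining_factor(18, 0.50))  # -> 9.0  ("about half" case)
print(remaining_factor(18, 0.75))  # -> 4.5  (the ~4x case)
```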
A major contributor to the observed exponential growth is the rise of proactive, automated detection efforts by ESPs [electronic service providers], as shown in Figure 3. Since then, reporting by ESPs increased an average of 101% year-over-year, likely due to increasing user bases and an influx of user-generated content. While automated detection solutions help ESPs scale their protections, law enforcement and NCMEC analysts currently contend with the deluge of reports in a non-automated fashion, as they are required to manually review the reports.