There was a time when it was possible to defend a child pornography charge by arguing that the images were virtual rather than real. In essence, the argument was that the images were “fake,” created by combining material depicting adults with material depicting children, and that the participants in the videos or still shots were not actually children. Any such defense is becoming increasingly implausible because of the power of Artificial Intelligence, or AI.
An increasing number of child pornography images and videos created with Artificial Intelligence are appearing online. These images and videos may not depict real children, but they appear to because the AI tools used to create them are powerful and increasingly sophisticated. We are now at a point where AI-generated images are indistinguishable from real ones. Gone are the days when a hand had too many fingers, a background was blurry, or the transition from frame to frame was not smooth. Moreover, some of the new material contains a mix of real and AI-generated child participants. Viewers exchange such material in internet forums and via social media. Additionally, not all of the material is posted purely for viewing. There have apparently been incidents of children being blackmailed after their facial images were lifted from an online school yearbook or other innocuous source.
Further, both the sophistication and the volume of these images are striking. The National Center for Missing and Exploited Children (“NCMEC”) reports that it received approximately 485,000 reports of AI-generated child pornography in the first half of this year, compared with approximately 67,000 for all of 2024. Amazon, which has an AI tool, reported 380,000 child pornography images and videos in the first half of 2025, while OpenAI reported 75,000 incidents. The British Internet Watch Foundation reports having identified almost 1,300 child pornography videos globally this year, compared to two in the first half of 2024. An increasing number of tech companies with AI tools are reporting these online discoveries to NCMEC, which, in turn, alerts law enforcement. Some of these companies also deploy screening tools and post warnings to their users.
At this juncture, the law in this area is unsettled because it has not yet caught up to the technology. Law enforcement has taken the position that federal child pornography laws cover materials generated by AI, even if they do not contain any images of real children. State legislators are also working to criminalize AI-generated child pornography, with between 30 and 40 such statutes enacted in the recent past. However, images generated entirely by AI still present legal challenges. A defendant in a case in Wisconsin Federal District Court challenged one of the charges against him on First Amendment grounds. The Court ruled that “the First Amendment generally protects the right to possess obscene material in the home [if it is not] actual child pornography.” For now, the trial in that case is proceeding on other charges, which relate to producing and distributing 13,000 images using an image generator. The defendant attempted to share the images with a minor via social media and was reported.
Federal law enforcement has made its position on this issue very clear. The director of the Justice Department’s Criminal Division recently stated that DOJ “views all forms of AI-generated [child pornography] as a serious and emerging threat.” Even though the law is unsettled because of the evolving technology, it is clear that these cases will be prosecuted aggressively.
James S. Friedman, Esq., a criminal defense attorney based in New Brunswick, New Jersey, represents individuals charged with possession or distribution of child pornography in all State and Federal courts in New Jersey. If you have a child pornography charge in any court in New Jersey, contact Mr. Friedman to discuss your case and learn about your options.