Pirated DataViz Books Used to Train AI

For all of my fellow book authors, you can now positively determine if pirated copies of your book(s) were illegally used in the training of AI models. Use the search tool here: https://secure.anthropiccopyrightsettlement.com/lookup
Doing a quick check, I confirmed that the Cool Infographics book is included, as well as many of the other data visualization and infographics books that I have in my library. A quick check of many other popular dataviz authors confirms that pirated books by Alberto Cairo, Steve Wexler, Cole Nussbaumer Knaflic, Nathan Yau, Edward Tufte, Stephen Few, Garr Reynolds, Jonathan Schwabish, RJ Andrews, Manuel Lima, Andy Kirk, Noah Iliinsky, Scott Berinato, Ben Jones, Naomi Robbins, and David McCandless are all included in the training database.
The double-edged sword is that the inclusion of these books makes the AI models really good at giving advice about data visualization, charts, and infographics. The models learned from the best. The downside is that no compensation or credit is given to any of these authors, who spent years building their knowledge and writing comprehensive books.
As part of the September 2025 class-action settlement in Bartz v. Anthropic, an online searchable Works List of all books included in the lawsuit is now available. Authors can search the database to positively determine which of their book(s) were used to train Anthropic's models. The list covers not just dataviz books, but thousands of fiction and non-fiction titles.
Historically, there hasn’t been a definitive list of the pirated books used to train the major AI platforms. The Books3 and LibGen datasets circulating online were allegedly sources of pirated books used by OpenAI (ChatGPT), Meta, and others to train their AI models, but none of the AI companies ever officially confirmed which specific pirated books they used.
What’s new with the Anthropic settlement is that a searchable database of the books covered under the lawsuit, and used in the training of their AI models, has been made public. The settlement forced Anthropic to be specific about which books were used. Although the search tool is convenient, it falls short of actually publishing the full list.
As an author, if you wish to be included in the settlement, you can file a claim on the settlement website: https://www.anthropiccopyrightsettlement.com