AI models used to generate text and images scrape websites for training data. "Do not scrape" disclaimers exist, as do various "do not scrape" lists, but they are largely unenforceable. Since AI tools use elements from their training data to generate images, poisoning images is one way to ensure that one's work cannot be used this way without permission. It also stops AI tools from directly using one's art to generate new images that imitate one's artistic style.
When generative AI models try to replicate images, they "see" them differently than humans do. This means that images can be subtly modified to thwart AI's attempts to recreate them while still appearing (mostly) the same to human eyes. Glaze and Nightshade are two tools that do this. Glaze is a defensive tool that modifies images so that AI cannot be used to copy one's artwork or artistic style. Nightshade is an offensive tool that modifies images into bad training data, gradually eroding the usefulness of any AI model trained on them. Images on my site have been run through one or both of these tools.
If you are interested in poisoning your own art, please check out the FAQs for Glaze or Nightshade. Glaze runs more quickly and uses less hard drive space, but both programs are very easy to download and use. Glaze also has a webtool available for free to human artists who do not use AI.