I would think there's a greater risk of it leaking or being abused in a giant stockpile. Those training sets would undoubtedly be commercialized in some way, which some might see as adding insult to injury.
Why would you think that? Every distribution, every view adds damage, even if the original victim doesn't know about it (or would rather not know).
I don't think AI training on a dataset counts as a view in this context. The concern is predators getting off on what they've done, not developing tools to stop them.
Debating what counts as a view is irrelevant. Some child pornography subjects feel violated by any storage or use of their images. Government officials store and use them regardless.
Why?

The damage is already done.