Ah, never mind, you're right:
> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices.
https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
Otherwise, they'd just keep doing the scanning server-side, on the material that's actually uploaded.
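For anyone who doesn't want to open the PDF, here's the rough shape of what the quoted passage describes, as a toy sketch only. Nothing below is Apple's actual code: the real system uses NeuralHash (a perceptual hash) and a blinded database with private set intersection, so the device can't read the hash list and the server learns nothing about non-matches. The SHA-256, the in-memory set, and the matches_known_hash name are stand-ins just to show where the match happens (on the device, before upload):

    import hashlib

    # Hypothetical on-device database of known hashes provided by
    # child-safety organizations (the real one is blinded/unreadable).
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def matches_known_hash(image_bytes: bytes) -> bool:
        """Return True if the image's hash appears in the on-device set."""
        # Stand-in: a cryptographic hash instead of a perceptual one.
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_HASHES

    # The point of the quoted design: this check runs on the device
    # before upload, rather than scanning the image in the cloud.
    print(matches_known_hash(b"example image bytes"))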