It's also probably worth blocking them with .htaccess or your server's equivalent, as described here: https://ethanmarcotte.com/wrote/blockin-bots/
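For anyone on Apache who doesn't want to read the whole post first, the idea is roughly this (a sketch, assuming mod_rewrite is enabled; the user-agent list here is just a sample, not exhaustive):

    # Return 403 to a few known AI crawler user agents
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (GPTBot|CCBot|ClaudeBot|Bytespider) [NC]
    RewriteRule ^ - [F]

Nginx and other servers have equivalent ways to match on User-Agent and deny the request.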
All this effort is futile because AI bots will simply send false user agents, but it's something.
I made my most heavily scraped page IPv6-only, and the AI bots vanished within a couple of days :-) (Hardly any complaints from actual users yet. Not zero, though.)
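If anyone wants to try the same trick: going IPv6-only is mostly a DNS change, i.e. publish only an AAAA record for that hostname and drop the A record (hostname and address below are made up):

    ; zone file snippet: AAAA only, no A record, so IPv4-only clients can't reach it
    scrape-magnet.example.com.  3600  IN  AAAA  2001:db8::42

You also need the page served on an IPv6 listener, of course.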
"As Tomchuk experienced, if a site isn’t properly using robot.txt, OpenAI and others take that to mean they can scrape to their hearts’ content."
The takeaway: check your robots.txt.
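In case it saves someone a lookup, the minimal robots.txt to opt named crawlers out of everything looks roughly like this (agent names should be checked against each vendor's docs; GPTBot is OpenAI's documented one):

    # Opt specific AI crawlers out of the whole site
    User-agent: GPTBot
    User-agent: CCBot
    Disallow: /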
How much request load robots can reasonably generate once they are allowed is a separate question.
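There's no standard way to express that in robots.txt; the nonstandard Crawl-delay directive is the closest thing, and only some crawlers honor it:

    User-agent: *
    Crawl-delay: 10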