Especially since so many anti-crypto people immediately pivoted to anti-AI. That sudden shift in priorities makes it hard to take them seriously.
Similarly, if I say "I object to the genocide in Gaza", would you assume that I don't also object to the Uyghur genocide?
This is nothing but whataboutism.
People are allowed to talk about the bad things AI does without adding a 3-page disclaimer explaining that they understand all the other bad things happening in the world at the same time.
If you take a strong argument and throw in an extra weak point, that just makes the whole argument less persuasive (even if that's not rational, it's how people think).
You wouldn't say "the Uyghur genocide is bad because ... and also the disposable plastic crap those slave factories produce is terrible for the environment."
Plastic waste is bad, but it's on such a different level from genocide that it's a terrible argument to make.