I can respect the first argument. I personally don't see any reason to believe AGI is impossible, but I also don't see evidence that it is possible with the current (very impressive) technology. We may never build an AGI in my lifetime, maybe not ever, but that doesn't mean it's not possible.
But the second argument, that humans do something machines aren't capable of, always falls flat to me for lack of evidence. If we're going to dismiss the possibility of something, we shouldn't do it without evidence. We don't have a full model of human intelligence, so I think it's premature to assume we know what isn't possible. All the evidence we have is that humans are biological machines: everything follows the laws of physics, and yet here we are. There isn't evidence that anything other than physical phenomena is going on, and there isn't any physical evidence that a biological machine can't be emulated.
At present we are already churning out intelligent beings at an alarming rate, with little understanding of what we are doing.
It's a lot easier to imagine us creating an extended intelligence, manifestly without understanding it. Current work may even turn out to be a component of that.
It's the second, more Pygmalion-like concept, that a human mind could conceive of an artificial mind and create it from its machines, that I find a little fanciful.