From facts and processes we know, we can derive novel information and conclusions. If you really understand why the sky is blue, you might be able to draw conclusions about why other things appear a certain color, like human eye color.
GPT can't do that kind of reasoning or make those extensions. It can only regurgitate what is already known and has been stated somewhere before in its training set.
It's very impressive; I just think people over-hype it into something it is not.