Great point.
> A language model such as GPT-3 operates only on words, not concepts. It can make connections between words on the basis of statistical correlations, but has no capacity for encoding concepts, and therefore cannot "know" anything.
Are you sure? Aren't "concepts" encoded in how language is used, at least to some degree?
LeCun does say that models that explicitly attempt to represent knowledge perform better than GPT-3 at answering questions. I'm no expert, but I believe him.
Good point, and I think this shows up in the way different languages can affect how we express particular concepts.
However, I think it is more accurate to say that language solidifies and gives form to how we express concepts, while the “concepts” themselves are independent of language. Only our “expression” of these “concepts” depends on language.
For anyone interested in art and art history, this distinction was the central focus of the Belgian surrealist painter René Magritte.
In our heads, language is a combination of words and concepts, and knowledge can be encoded by making connections between concepts, not simply words. If there is no concept or idea backing up the words, it can hardly be called knowledge. Consider the case of the man who did not speak French, yet memorised a French dictionary and went on to win a French Scrabble championship. Just because he knows the words, would you say he knows the language?
A language model such as GPT-3 operates only on words, not concepts. It can make connections between words on the basis of statistical correlations, but has no capacity for encoding concepts, and therefore cannot "know" anything.
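To make "connections between words on the basis of statistical correlations" concrete, here's a minimal sketch of the classic distributional-semantics idea that models like GPT-3 scale up enormously: count which words occur near which other words, then compare words by the similarity of their contexts. The toy corpus, window size, and word choices here are all invented for illustration; real models learn dense vectors rather than raw counts.

```python
from collections import Counter
from math import sqrt

# Toy corpus: the only "statistics" available are which words
# occur near which other words.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog . "
    "a king rules the kingdom . "
    "a queen rules the kingdom ."
).split()

WINDOW = 2  # count co-occurrences within +/- 2 tokens

def cooccurrence_vector(target):
    """Count which words appear near `target` across the corpus."""
    counts = Counter()
    for i, word in enumerate(corpus):
        if word == target:
            lo, hi = max(0, i - WINDOW), min(len(corpus), i + WINDOW + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[corpus[j]] += 1
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = sqrt(sum(v * v for v in a.values())) * \
           sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

cat, dog, king = (cooccurrence_vector(w) for w in ("cat", "dog", "king"))
print(f"cat~dog:  {cosine(cat, dog):.2f}")   # higher: similar contexts
print(f"cat~king: {cosine(cat, king):.2f}")  # lower: different contexts
```

On this tiny corpus, "cat" and "dog" come out far more similar than "cat" and "king", purely from word co-occurrence and with no concept of animals anywhere in the system. Whether that kind of regularity amounts to an encoding of concepts, or is merely a shadow of them, is exactly the question being argued above.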