Could it be fixed by splitting the text into smaller parts? Looks easy to implement.
disgruntledphd2
Yeah, that's normally a good approach, but you might end up using different words for the same concept in different parts unless you feed in more context, which also counts against the token limit.
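A rough sketch of the trade-off being described: split the text into chunks, but prefix each chunk with a shared context (e.g. a glossary of terms) so wording stays consistent, at the cost of that context eating into each chunk's token budget. The function name and word-based token counting here are illustrative; a real implementation would use the model's actual tokenizer.

```python
def chunk_with_context(text, shared_context, max_tokens=512):
    """Split `text` into chunks that each fit within `max_tokens`,
    prefixing every chunk with `shared_context` so terminology
    stays consistent across parts.

    Tokens are approximated by whitespace-split word count."""
    context_tokens = len(shared_context.split())
    budget = max_tokens - context_tokens
    if budget <= 0:
        raise ValueError("shared context alone exceeds the token limit")

    words = text.split()
    chunks = []
    for start in range(0, len(words), budget):
        body = " ".join(words[start:start + budget])
        chunks.append(shared_context + "\n\n" + body)
    return chunks
```

Note that a bigger shared context means smaller chunks and more of them, which is exactly the cost mentioned above.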