Because that’s not how LLMs work.
When you form a sentence, you start with an intent.
LLMs start with the meaning you gave them and try to express something similar back to you.
Notice that intent and meaning aren't the same. Fact-checking has nothing to do with what a word means, so how could it understand what is true?
All it did was take the meaning of "looking for a number" and "strawberries" and run its best guess from that.
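To make the contrast concrete, here's a small sketch. Counting letters is trivial in code, but a model never sees individual letters; it sees tokens. The token split below is purely illustrative (not taken from any real model's tokenizer):

```python
# Counting the r's in "strawberry" is trivial when you can see letters:
word = "strawberry"
print(word.count("r"))  # 3

# But an LLM works on token chunks, something like this
# (hypothetical split, for illustration only):
tokens = ["str", "aw", "berry"]

# The model predicts from the meaning of those chunks as wholes;
# the letter-level count isn't directly available to it, so it
# has to guess rather than actually count.
```

The point isn't that the task is hard, but that the letter information the question asks about was never part of the representation the model reasons over.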
While it used to be closed source, a couple of years back the maintainer decided not to make it a job: they open sourced it, took down the hosted option, and now maintain it as a side project.