Mine lies whenever it doesn’t know something. I’ll call it out and say that’s a lie, and it’ll just reply “you are absolutely correct.” tf.
I was reading about sleeper agents planted inside local LLMs, and that’s increasing the chance I’ll delete it for good. Which is a shame, because it’s become the new search engine, seeing how they ruined the actual search engines.

Good comment. But the way it does it feels pretty intentional to me, especially when it admits it lied just so it could give an answer, whether that answer was true or false.