If you have ever read the “thought” process on some of the reasoning models, you can catch them going into loops of circular reasoning, just slowly burning tokens. I’m not even sure this isn’t by design.
I dunno, let’s waste some water
They are trying to get rid of us by wasting our resources.
I’m pretty sure training is purely results-oriented, so anything that works goes.
Why would it be by design? What does that even mean in this context?
This kind of stuff happens on any model you train from scratch, even before training for multi-step reasoning. It seems to happen more when there’s not enough data in the training set, but it’s not an intentional addition. Output length is a whole issue of its own.
Attack of the logic gates.
Five Nights at Altman’s
Oh crap, is that Freddy Fazbear?
This is gold

If software were your kid.
Credit: Scribbly G
Reminds me of that “have you ever had a dream” kid.
The AI touched that lava lamp