A screenshot of this question was making the rounds last week, but this article covers testing it against all the well-known models out there.

It also includes outtakes on the ‘reasoning’ models.

  • HugeNerd@lemmy.ca · 2 months ago

    Instead it tries to make sense of it. Why? Because it learned strong language priors from us, and it leans on those when the prompt is meaningless.

    No, it doesn’t. You’re in sci-fi land. There is no “it” “trying to make sense”. That cogitation is happening in YOU, not the motherboard.
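
    For what it’s worth, the “language priors” point is easy to poke at directly. Below is a minimal sketch (assuming the Hugging Face transformers library and the public gpt2 checkpoint, neither of which the thread mentions): hand the model a grammatical but meaningless prompt and look at the next-token distribution it emits. Mechanically, all you get back are probabilities shaped by the training text; whether that counts as “trying to make sense” is exactly the argument above.

    ```python
    # Minimal sketch: inspect the next-token distribution a causal LM assigns
    # after a well-formed but meaningless prompt. Assumes the Hugging Face
    # transformers library and the public "gpt2" checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    prompt = "Colorless green ideas sleep"  # grammatical, but meaningless
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]   # scores for the next token
        probs = torch.softmax(logits, dim=-1)    # convert scores to probabilities

    # Print the five most likely continuations and their probabilities.
    top = torch.topk(probs, k=5)
    for p, idx in zip(top.values, top.indices):
        print(f"{tokenizer.decode(idx.item())!r}: {p.item():.3f}")
    ```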