• davel@lemmy.ml
    1 month ago

    Try feeding them non-halting problems that send them into infinite loops of token consumption.

    • veroxii@aussie.zone
      1 month ago

      I like the idea, but most chatbots have timeout limits. And even agentic workflows have step limits to stop infinite loops.

      However, that’s because it’s super easy for LLMs to get stuck in loops. You don’t even need a non-halting problem; they’re stupid enough on their own.
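The step limit mentioned above can be sketched roughly like this. Everything here is hypothetical: `llm_step` and `run_tool` are stand-ins for a real agent framework's model call and tool execution, and `MAX_STEPS` is an arbitrary budget, not any particular product's default.

```python
MAX_STEPS = 10  # hard cap so a looping model can't burn tokens forever


def llm_step(history):
    # Placeholder model call: this fake model always requests the same
    # tool again, simulating an LLM stuck in a loop.
    return {"action": "search", "args": {"q": "same query"}}


def run_tool(action, args):
    # Placeholder tool execution.
    return f"result of {action}({args})"


def run_agent(task):
    history = [task]
    for _ in range(MAX_STEPS):
        decision = llm_step(history)
        if decision["action"] == "finish":
            return decision.get("answer")
        history.append(run_tool(decision["action"], decision["args"]))
    # Step budget exhausted: bail out instead of looping indefinitely.
    return None


print(run_agent("find something"))  # → None
```

Since the stubbed model never issues a `finish` action, the loop exhausts its budget and the agent gives up after exactly `MAX_STEPS` iterations, which is the behavior the comment describes.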

      • davel@lemmy.ml
        1 month ago

        Yeah, I assumed they had some sort of breaker, but hitting that limit is still expensive for them if you can get them to do it over and over with a script that does the prompting.