• Blackmist@feddit.uk · 9 months ago

    Yeah, had that on my very first attempt at using it.

    It used a component that didn’t exist. I called it out and it went “you are correct, that was removed in <older version>. Try this instead.” and created an entirely new set of bogus components and functions. This cycle continued until I gave up. It knows what code looks like and what the excuses look like, and that’s about it. There’s zero understanding.

    It’s probably great if you’re doing some common homework (a JavaScript Fibonacci sequence or something) or a menial task, but for anything that might reach the edges of its “knowledge”, it has no idea where those edges may lie, so it just bullshits.
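
    Something like this is about the scale it can handle reliably (a rough illustrative sketch, written in TypeScript here rather than plain JavaScript, and not any particular model’s output):

    ```typescript
    // An iterative Fibonacci function: the kind of beginner-level snippet LLMs do well.
    function fibonacci(n: number): number {
      let a = 0; // F(0)
      let b = 1; // F(1)
      for (let i = 0; i < n; i++) {
        [a, b] = [b, a + b]; // advance one step along the sequence
      }
      return a;
    }

    console.log(fibonacci(10)); // 55
    ```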

    • db0@lemmy.dbzer0.com (OP) · 9 months ago

      “Hallucinate” is the standard term used to describe GenAI models coming up with untrue statements.

      • Cyrus Draegur@lemm.ee · 9 months ago (edited)

        in terms of communication utility, it’s also a very accurate term.

        when WE hallucinate, it’s because our internal predictive models are flying off the rails filling in the blanks based on assumptions rather than referencing concrete sensory information and generating results that conflict with reality.

        when AIs hallucinate, it’s because their predictive models generate results that don’t align with reality, flying off the rails and presuming whatever was calculated to be likely to exist rather than referencing positively certain information.

        it’s the same song, but played on a different instrument.
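
        to make that concrete, here’s a toy sketch (purely illustrative, nothing like a real model’s internals): a generator that just emits whatever its learned counts say is most likely, with no step anywhere that checks the output against reality.

        ```typescript
        // Toy illustration only - not how a real LLM works internally.
        // The "model" just picks the statistically likeliest continuation it has seen,
        // whether or not that continuation corresponds to anything that actually exists.
        // The table of likelihoods below is made up for the example.
        const learnedLikelihoods: Record<string, Record<string, number>> = {
          "import": { "left-pad": 0.4, "lodash": 0.35, "definitely-real-lib": 0.25 },
        };

        function predictNext(prev: string): string {
          const candidates = Object.entries(learnedLikelihoods[prev] ?? {});
          if (candidates.length === 0) return "<unk>";
          // pick whatever was "calculated to be likely to exist" - no grounding check
          candidates.sort((a, b) => b[1] - a[1]);
          return candidates[0][0];
        }

        console.log(predictNext("import")); // "left-pad": plausible-looking, never verified
        ```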

        • kronisk @lemmy.world · 9 months ago

          when WE hallucinate, it’s because our internal predictive models are flying off the rails filling in the blanks based on assumptions rather than referencing concrete sensory information and generating results that conflict with reality.

          Is it really? You make it sound like this is a proven fact.

  • anlumo@lemmy.world · 9 months ago

    I just want an LLM with a reasonable context window so we can actually write real working packages with it.

    The demos look great, but it’s always just around 100 lines of code, which is beginner level. The only use case right now is fake packages.

  • RustyNova@lemmy.world · 9 months ago

    *bad Devs

    Always look on the official repository. Not just to see if it exists, but also to make sure it isn’t a fake/malicious one.
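
    For npm, even a quick existence check against the public registry catches the pure hallucinations (a rough sketch assuming Node 18+ with a global fetch; a name existing still doesn’t mean the package is legitimate, so check the maintainers and downloads too):

    ```typescript
    // Sanity check against the public npm registry: HTTP 200 = published, 404 = not found.
    // Passing this only proves the name exists; it says nothing about trustworthiness.
    async function packageExists(name: string): Promise<boolean> {
      const res = await fetch(`https://registry.npmjs.org/${encodeURIComponent(name)}`);
      return res.ok;
    }

    packageExists("left-pad").then((exists) =>
      console.log(exists ? "published on npm" : "not found, possibly hallucinated"),
    );
    ```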

    • maynarkh@feddit.nl · 9 months ago (edited)

      *bad Devs

      Or devs who don’t give a shit. Most places have a lot of people who don’t give a shit because the company does not give a shit about them either.

      • Passerby6497@lemmy.world · 9 months ago

        What’s the difference between a bad dev and a dev that doesn’t care? Either way, whether it’s lack of skill or lack of care, a bad dev is a bad dev at the end of the day.

        • Obinice@lemmy.world · 9 months ago

          I can be good at a trade, but if I’m working for a shit company with shit pay and shit treatment, they’re not going to get my best work.

          You get out what you put in; that’s something employers don’t realise.