• tabular@lemmy.world · 2 days ago

    Sorry if my replies are annoying, I merely find this subject interesting. Feel free to ignore this.

    It is not obvious to me why a being couldn’t have an “understanding” without a “thought”. I do not believe it’s possible to verify whether a creature has a subjective experience, but an “understanding” of the environment could be attributed to how a being performs in it (a crow gets food that was just out of reach inside a tube of water by adding rocks to raise the water level). I have some experience in game-dev programming and a limited understanding of consciousness as a personal interest, if that’s helpful.

    • Buffalox@lemmy.world · 2 days ago

      Oh no, this is not annoying; this is a very interesting question.
      I suppose the crow doesn’t need to understand the volume of water and how rocks displace it, but merely has a more basic understanding that adding rocks raises the water, or maybe even just that it makes the food easier to get at.
      So I suppose we can agree that there are multiple levels of understanding.
      But still, the crow must have observed this, unless it actually figured it out? And some thought process must have led it to believe that dropping stones in the water might have the desired effect.
      Even if the crow had only observed another crow doing this and seen it demonstrated, it must have had a thought process concluding that it could try this too, and that it might work.

      But there are other situations that are more challenging IMO, and that’s with LLMs: how do we decide thought and understanding with those?
      An LLM is extremely stupid and clever at the same time, with loads of examples of it not understanding the simplest things, like how many R’s are in “strawberry”, the AI stubbornly answering that there are only 2! But on the other hand it is able to spell the word out and count the letters, and then realize that there are indeed 3, which it previously denied.
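
      As a toy illustration of why spelling it out helps (my own sketch, nothing to do with any particular model’s internals): once the word is an explicit character sequence, counting is a trivial step-by-step task, whereas a token-based model never directly “sees” the letters.

      ```python
      # Toy sketch: counting R's is trivial once "strawberry" is treated
      # as an explicit sequence of characters, step by step.
      word = "strawberry"

      # Direct count, the way an ordinary program sees the string:
      print(word.count("r"))  # 3

      # "Spelling it out" one character at a time, like the LLM did
      # when it finally got the answer right:
      total = 0
      for i, ch in enumerate(word, start=1):
          hit = " <- r" if ch == "r" else ""
          print(f"{i}: {ch}{hit}")
          if ch == "r":
              total += 1
      print("total r's:", total)  # 3
      ```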

      IMO animal studies are crucial to understanding our own intelligence, because the principle is the same, but animals are a simpler “model” of it, so to speak.
      It seems to me that thought is a requirement for understanding: you think about something before you understand it.
      Without a thought process it would have to be instinctive, and I don’t think it can be argued that crows dropping rocks in water is instinctive.
      But even instinctive understanding is a kind of understanding; it’s just not held by our consciousness, but by certain behavioral traits having an evolutionary advantage, causing that behavior to become more common.

      So you are absolutely right that thought is not always required for some sort of “understanding”, which is a good point.
      But what I meant was conscious understanding, as in really understanding a concept and, for humans, understanding abstract terms; for that type of understanding, thought is definitely a requirement.

      • tabular@lemmy.world · 6 hours ago

        The crows were shown how to get the food, IIRC.

        My understanding is that LLMs contain artificial neural networks: a simplification, with a number of weights similar to a small animal’s. A simpler model ought to make investigation clearer 😅

        Neural networks are “trained” by adjusting the weights on “neurons”. I assume real brains are training themselves on every input, while an LLM is limited to sessions with training data. Do you suspect there could be a thought process when it’s working out how many letters are in strawberry? What about when its weights are adjusted during training?
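
        To make “adjusting the weights” concrete, here’s a minimal sketch of one artificial “neuron” trained by gradient descent (a toy of my own, not how any real LLM is built). The point is the contrast: weights only change inside the training loop, while answering afterwards just runs the frozen weights forward.

        ```python
        # Minimal toy: one "neuron" (a weight w and a bias b) trained by
        # gradient descent to fit y = 2x + 1. Not a real LLM, just the
        # core idea of "adjusting the weights".
        w, b = 0.0, 0.0
        lr = 0.1  # learning rate
        data = [(x / 10, 2 * (x / 10) + 1) for x in range(11)]  # (input, target)

        # Training phase: the weights change on every example.
        for epoch in range(200):
            for x, target in data:
                pred = w * x + b      # forward pass
                err = pred - target   # prediction error
                w -= lr * err * x     # gradient step for w
                b -= lr * err         # gradient step for b

        # Inference phase: the weights are frozen; "answering" is just
        # running the fixed weights forward. Nothing is learned here.
        print(f"w = {w:.2f}, b = {b:.2f}")          # close to w = 2, b = 1
        print("f(0.35) =", round(w * 0.35 + b, 2))  # close to 1.7
        ```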

        • Buffalox@lemmy.world · 5 hours ago

          I think whether you call it a thought process or not comes down to the definition of what you mean by that. It’s definitely intelligence, and there definitely is a process.
          So I wouldn’t have a problem calling it a thought process. But it’s not self-consciousness yet, though we may not be very far from it.
          It’s amazing the progress that has been achieved over the past decade.
          When I predicted 2035 as a point where we could possibly achieve strong AI, it was at a time when we’d had 2-3 decades of very little progress. But I’ve always been certain that the human brain is a 100% natural phenomenon, and that its function can be copied, just like everything else in nature. And when that is achieved, there will still be room for improvement.
          As a natural process, our brain is built on the physical properties of atoms, so IMO it’s only a matter of time before we have an artificial intelligence that is just as valid to call self-conscious as we are.