• KevonLooney@lemm.ee
    5 months ago

    That’s actually really smart. But that info wasn’t given to the doctors examining the scan, so it’s not a fair comparison. It’s a valid diagnostic technique to focus on the problems that are common in the local area.

    “When you hear hoofbeats, think horses, not zebras” (outside of Africa)

    • chonglibloodsport@lemmy.world
      5 months ago

      AI is weird. It may not have been given that information explicitly; instead, it could be picking up on an artifact of the scan itself caused by the different equipment. For example, if one scanner produced lower-resolution images and all the other scans were resized down to match it, the AI might key on the resizing artifacts, which aren’t present in the one scan that was never resized.
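
      A rough sketch of that kind of shortcut, using synthetic noise images rather than real scans (NumPy and Pillow assumed; the resize direction here is just one way the confound can show up):

      ```python
      # Rough sketch: how a resize step can leave a "which scanner" fingerprint.
      # Uses synthetic noise images (not real scans). A native 256px image keeps
      # its pixel-level detail, while a 128px image upsampled to 256px loses it --
      # a shortcut a model can learn instead of anything medical.
      import numpy as np
      from PIL import Image

      rng = np.random.default_rng(0)

      def fake_scan(native_size: int, target_size: int = 256) -> np.ndarray:
          """Random 'scan' captured at native_size, then resized to target_size."""
          pixels = (rng.random((native_size, native_size)) * 255).astype(np.uint8)
          img = Image.fromarray(pixels).resize((target_size, target_size), Image.BILINEAR)
          return np.asarray(img, dtype=float)

      def sharpness(arr: np.ndarray) -> float:
          """Mean absolute difference between horizontal neighbours (crude high-frequency score)."""
          return float(np.abs(np.diff(arr, axis=1)).mean())

      scanner_a = [sharpness(fake_scan(256)) for _ in range(5)]  # native resolution
      scanner_b = [sharpness(fake_scan(128)) for _ in range(5)]  # upsampled, so smoother

      print("scanner A:", [round(s, 1) for s in scanner_a])
      print("scanner B:", [round(s, 1) for s in scanner_b])
      # A single threshold on this score separates the two scanners perfectly,
      # without looking at anything resembling pathology.
      ```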

      • KevonLooney@lemm.ee
        5 months ago

        I’m saying that info is readily available to doctors in real life. They are literally in the hospital and know the patient’s socioeconomic background, so they would be able to make the same guess.

      • Maven (famous)@lemmy.zip
        5 months ago

        The manufacturing date of the scanner was actually saved as embedded metadata in the scan files themselves. None of the researchers thought it would matter until after the experiment, when they found it was THE thing the machines were looking at.
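
        That kind of header leak is easy to miss. As a rough sketch (assuming the scans are DICOM files and using the pydicom library; I don’t know which exact field leaked in this study), here’s how you might inspect and strip device-identifying header fields before training:

        ```python
        # Rough sketch: strip scanner-identifying DICOM header fields before training.
        # Assumes pydicom is installed and the scans are DICOM; the field list is
        # illustrative, not the exact one from the study.
        import pydicom

        DEVICE_KEYWORDS = [
            "Manufacturer",
            "ManufacturerModelName",
            "DeviceSerialNumber",
            "SoftwareVersions",
            "StationName",
        ]

        def strip_device_fields(in_path: str, out_path: str) -> None:
            ds = pydicom.dcmread(in_path)
            for kw in DEVICE_KEYWORDS:
                if kw in ds:  # membership check by DICOM keyword
                    print(f"removing {kw}: {ds.data_element(kw).value}")
                    delattr(ds, kw)  # drop the element from the dataset
            ds.save_as(out_path)

        # strip_device_fields("scan.dcm", "scan_clean.dcm")
        ```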