• HarkMahlberg@kbin.social

    Let’s not confuse ourselves here. The opposite of one evil is not necessarily a good. Police reviewing their own footage, investigating themselves: bad. Unreliable AI beholden to corporate interests and shareholders: also bad.

    • pearsaltchocolatebar@discuss.online

      It’s fine to not understand what “AI” is and how it works, but you should avoid making statements that highlight that lack of understanding.

      • tabular@lemmy.world

        If you feel one’s knowledge is lacking then explaining it may convince them, or others reading your post.

        • lolcatnip@reddthat.com

          Speaking of a broad category of useful technologies as inherently bad is a dead giveaway that someone doesn’t know what they’re talking about.

    • BearOfaTime@lemm.ee

      Yep.

Y'all like surveillance so much? Let's put all government employees under a camera all the time. Of all the places I find cameras offensive, that one not so much.

      • mndrl@lemmy.world

I sure hope you get your daily dose of enjoying people's misery by watching the substitute teacher crying in the teachers' lounge.

        • lolcatnip@reddthat.com

          Cameras in a teacher’s lounge would be ridiculous but, in principle, cameras in classrooms make a lot of sense. Teachers are public officials who exercise power over others, and as such they need to be accountable for their actions. Cameras only seem mean because teachers are treated so badly in other ways.

          • mndrl@lemmy.world

Sure thing, buddy. They exert such power that they can barely make teens stay put for ten minutes without fucking around with their phones. So much power.

  • werefreeatlast@lemmy.world

If this works, why not have an AI automatically investigate judges and government officials? The AI should indicate, for example, whether a judge needs to recuse him or herself... that came up several times this year. And for politicians, the AI would tell us if they are lying, or if they are allowing or taking part in corruption. For this purpose, they should wear a microphone and camera the entire time they are government officials. Don't like it? Too bad, that's the law. Right?

    • UnderpantsWeevil@lemmy.world

      why not have an AI automatically investigate Judges and government officials

      Because the power is supposed to originate with said Judges/Officials. The AI tool is a means of justifying their decisions, not a means of exerting their authority. If you give the AI power over the judges/officials, why would they want to participate in that system? If they were proper social climbers, they would - instead - aim to be CEOs of AI companies.

  • Dizzy Devil Ducky@lemm.ee

I have a sneaking suspicion that if police in places like America start using AI to review bodycam footage, they'll just "pay" someone to train their AI so that it always says the police officer was in the right when killing innocent civilians, and the footage never gets flagged. That, or do something equally shady and suspicious.

    • UnderpantsWeevil@lemmy.world

      These algorithms already have a comical bias towards the folks contracting their use.

      Case in point, the UK Home Office recently contracted with an AI firm to rapidly parse through large backlogs of digital information.

      The Guardian has uncovered evidence that some of the tools being used have the potential to produce discriminatory results, such as:

      • An algorithm used by the Department for Work and Pensions (DWP) which an MP believes mistakenly led to dozens of people having their benefits removed.

      • A facial recognition tool used by the Metropolitan police which has been found to make more mistakes recognising black faces than white ones under certain settings.

      • An algorithm used by the Home Office to flag up sham marriages which has been disproportionately selecting people of certain nationalities.

      Monopoly was a lie. You're never going to get that Bank Error In Your Favor. It doesn't happen. The House (or, the Home Office, in this case) always wins when these digital tools are employed, because the money for the tool is predicated on these agencies clipping benefits and extorting additional fines from the public at large.