• rekorse@lemmy.world
    27 days ago

    We can do all those things without AI. Why do you care how fast it happens? If we could cure cancer twice as fast by grinding up baby animals, would you do it?

    • Ookami38@sh.itjust.works
      26 days ago

      Probably not the best look to imply you want cancer treatment research to slow down simply because you don’t like the tool used to do it. There’s a lot of shit wrong with our current implementations of AI, but let’s not completely throw the baby out with the bathwater, eh?

        • Ookami38@sh.itjust.works
          26 days ago

          Why do you care how fast it happens?

          I care how fast it happens because I don’t want it to slow down.

          • rekorse@lemmy.world
            23 days ago

            We can’t just use the fear of death to justify any means to prevent it. If we found out we could live eternally but had to destroy other creatures or humans to do so, we would consider that to be too high a cost.

            The cost of AI at the moment is just immoral. Even those who have found methods to deal with the costs are still benefiting from calling it AI, in the form of investments and marketing. Calling their work AI is worth money because of all of this fraudulent behavior.

            If I started growing and producing my own organic, abuse-free heroin and selling it, it would still be immoral, because I’m benefiting from the economy created by the illegal market. I’m participating in that market despite my efforts.

            I’ve said before that if these companies doing the ethical AI stuff want to stop being criticized for being part of this AI nonsense, they’re free to call it something else. AI is overly broad and applied incorrectly all the time as it is anyway, and is mainly applied to things to draw money and interest that otherwise wouldn’t exist.

            It’s a way to signal to investors that there is a profit incentive to be focused on here.

            • Ookami38@sh.itjust.works
              23 days ago

              We can’t just use the fear of death to justify any means to prevent it. If we found out we could live eternally but had to destroy other creatures or humans to do so, we would consider that to be too high a cost.

              Sure, there are costs that are too high for anything.

              The cost of AI at the moment is just immoral. Even those who have found methods to deal with the costs are still benefiting from calling it AI, in the form of investments and marketing. Calling their work AI is worth money because of all of this fraudulent behavior.

              This is the part where it breaks down though. There’s nothing inherently immoral about AI. It’s not the concept of AI you have problems with. It’s the implementation. I hate a lot of the implementation, too. Shoehorning an AI into everything, using AI to justify a reduction in labor, that all sucks. The tool itself, though? Pretty fuckin awesome.

              If I started growing and producing my own organic, abuse-free heroin and selling it, it would still be immoral, because I’m benefiting from the economy created by the illegal market. I’m participating in that market despite my efforts.

              Are we comparing this to cancer research still? If so, that’s a bit of a WILD statement. It’s pretty close to the COVID vaccine denial mentality: because it was made using something I don’t like/fully understand, it must be bad.

              I’ve said before that if these companies doing the ethical AI stuff want to stop being criticized for being part of this AI nonsense, they’re free to call it something else. AI is overly broad and applied incorrectly all the time as it is anyway, and is mainly applied to things to draw money and interest that otherwise wouldn’t exist.

              Ok, let’s go back to drugs, then. If we were making your organic, free-trade heroin but called it beroin so that we’re not piggybacking off the heroin market, we’re good? No, that doesn’t make sense. Heroin will fuck up someone’s life regardless of what you call it, how it was produced, etc. There’s (virtually) no legitimate, useful application of heroin, and probably not one whose production we’d ever see broadly okayed.

              Conversely, you’ve already agreed that there are ethical uses and applications of AI. It doesn’t matter what the name is; it’s the same technology. AI has become the term for this technology, just like heroin has become the term for that drug, and no matter what else you want to call it, everyone already knows what you mean. Its uses are still the same. Its impact is still the same.

              So yeah, if you just have a problem with, say, cancer researchers using AI and would rather they use, idk, AGI or any of the other alternative names, I think you’re missing the point.

              • rekorse@lemmy.world
                19 days ago

                I’m not saying they shouldn’t do research at all, just that they should take steps to separate themselves from the awful practices of the big players right now.

                We should be able to talk about advances in cancer research without having to have a discussion about how AI is going overall, including the shitty actors.

                And, to be fair, most of the good projects you are defending do very publicly differentiate themselves and explain how they are more responsible. All I’m saying is that that’s a good thing. Companies should be scrambling to distance themselves from OpenAI, Copilot, and whatever else the big tech companies have created.

    • pyre@lemmy.world
      27 days ago

      Yes. I love animals, but if my kid had a cold and I knew a puppy’s breathing caused it, I would drown that puppy myself. Let alone finding a cure for fucking cancer.

      That being said, AI isn’t doing that, and even if it were, I wouldn’t trust the results.

      • rekorse@lemmy.world
        23 days ago

        Hey, at least you owned the logical conclusion of your argument. I can respect that.

        I do disagree, but I’m also vegan, so that’s probably why.