• Teppichbrand@feddit.de · 1 month ago

    Innovation is a scam; it breeds endless bullshit we keep buying and talking about like 10-year-olds with their latest gimmick.
    Look, they replaced this button with A TOUCHSCREEN!
    Look! This artificial face has PORES NOW!
    LOOK! This coffee machine costs $2000 now and uses PROPRIETARY, SUPER-EXPENSIVE CAPSULES!!
    We need progress, which is harder to achieve because it takes a paradigm shift on an individual and social level. It’s much less gadgety.

  • 31337@sh.itjust.works · 1 month ago

    A lot of the “elites” (OpenAI board, Thiel, Andreessen, etc) are on the effective-accelerationism grift now. The idea is to disregard all negative effects of pursuing technological “progress,” because techno-capitalism will solve all problems. They support burning fossil fuels as fast as possible because that will enable “progress,” which will solve climate change (through geoengineering, presumably). I’ve seen some accelerationists write that it would be ok if AI destroys humanity, because it would be the next evolution of “intelligence.” I dunno if they’ve fallen for their own grift or not, but it’s obviously a very convenient belief for them.

    Effective accelerationism descends from the accelerationism of Nick Land, who appears to be some kind of fascist.

  • TheFeatureCreature@lemmy.world · 1 month ago

    On the plus side, the industry is rapidly moving towards locally-run AI models, specifically because companies don’t want to purchase and run fleets of these absurd things or any other expensive hardware.

    • MudMan@fedia.io · 1 month ago

      The tragic irony of the kind of misinformed article linked here is that the server farms running this stuff are fairly efficient. The water is reused and recycled, and the heat is often put to use in other applications, because wasting fewer resources is cheaper than wasting more.

      But all those locally-run models on laptop CPUs and desktop GPUs? That’s grid power being turned into heat and vented into a home (probably with air conditioning on).

      The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether or not they match the new challenges, is going to PR this tech into wasting far more energy than it otherwise would by distributing it across billions of consumer devices paid for by individual users. And nobody is going to notice or care.

      I do hate our media landscape sometimes.

      • Chee_Koala@lemmy.world · 1 month ago

        But efficiency is not the only consideration; privacy and self-reliance are important facets as well. Your argument about efficient computing is 100% valid, but there is a lot more to it.

      • XeroxCool@lemmy.world · 1 month ago

        If I make a gas engine with 100% heat efficiency but only run it in my backyard, do the greenhouse gases not count because it’s so efficient? Of course they do. The high efficiency of a data center is great, but that’s not what the article laments. The problem it calls out is the absurdly wasteful reason these farms will flourish: powering excessively animated programs that feign intelligence, vainly burning energy on what a simple program was already handling.

        It’s the same story with lighting. LEDs seemed like a savior for energy consumption because they were so efficient. Sure, they save energy overall (for now), but they prompted people to multiply the number of lights and the total output by an order of magnitude simply because light became so cheap. That creates a secondary issue of further increasing light pollution and intrusion.

        Greater efficiency doesn’t make things right if it comes with an increase in use.
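
        A quick back-of-the-envelope sketch of that rebound effect (all numbers here are illustrative assumptions, not measurements):

        ```python
        # Rebound effect: per-unit efficiency improves, total use grows anyway.
        incandescent_watts = 60   # assumed draw of a typical incandescent bulb
        led_watts = 8             # assumed LED with comparable light output

        bulbs_before = 10         # assumed fixture count before the switch
        bulbs_after = 40          # assumed count once light feels "free"

        before = incandescent_watts * bulbs_before   # 600 W
        after = led_watts * bulbs_after              # 320 W

        print(f"Before: {before} W, after: {after} W")
        # Per-bulb draw fell ~87%, but total draw fell only ~47%, while total
        # light output quadrupled: the light-pollution problem noted above.
        ```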

        • MudMan@fedia.io · 1 month ago

          For one thing, it’s absolutely not true that what these apps provide is the same as what we had. That’s another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not on search, where LLM queries are less reliable at finding facts and being accurate but better at matching fuzzy searches without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.

          For another, some of the numbers being thrown around are not realistic or factual, are not presented in context, or are part of a power-increase trend that was already ongoing with earlier applications. The average high-end desktop PC ran on 250W in the 90s and 500W in the 2000s; mine now runs at 1000W. Playing a videogame used to burn as much power as a couple of lightbulbs; now it’s the equivalent of turning on your microwave oven.

          The argument that we are burning more power because we’re using more compute for entertainment purposes is not factually incorrect, but it is hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context) and entirely consistent with how we have used other computing features for ages.

          The only reason you’re so mad about me wasting some energy asking an AI to generate a cute picture, but not about me using an AI to generate frames for my videogame, is that one of those is a viral panic that maps neatly onto the anti-crypto panic people already had, while the other is a frog that has been slow-boiling for three decades, so people have no reason to hold an opinion about it.

    • helenslunch@feddit.nl · 25 days ago

      Doesn’t really make any sense. You could have one 4090 running AI for a hundred people rather than a 4060 per person running 24/7.
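
      A rough utilization sketch of that point (the board-power figures are approximate and the usage pattern is an assumption):

      ```python
      # Shared inference vs. a dedicated local GPU per user.
      rtx_4090_watts = 450   # approximate board power of the shared server card
      rtx_4060_watts = 115   # approximate board power of each local card
      users = 100

      shared = rtx_4090_watts          # 450 W total, time-shared via batching
      local = rtx_4060_watts * users   # 11,500 W if everyone runs their own

      print(f"Shared: {shared} W ({shared / users:.1f} W per user); "
            f"local: {local} W total")
      # One well-utilized card serving batched queries beats thousands of
      # watts of mostly idle consumer GPUs.
      ```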

  • drawerair@lemmy.world · 22 days ago

    I like that the writer thought about climate change. It’s been one of the biggest global issues for a long time. I hope there’ll be increasing use of sustainable energy, not just for data centers but for the whole tech world, in the coming years.

    I think a digital waiter doesn’t need a rendered human face. We have food-ordering kiosks. Those aren’t AI, and I think they suffice. A self-checkout grocery kiosk doesn’t need a face either.

    I think client support is where AI can at least help. Imagine a firm that’s been operating for decades and has encountered every kind of client complaint. It can feed all that data to a large language model. With that model responding to most client complaints, the firm can reduce the number of its support people. The model will pass complaints that are too complex, or that it doesn’t know how to address, to the support people. The model handles the easy and medium complaints; the support people handle the rest.
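
    A minimal sketch of that triage flow, with the model calls stubbed out (the classifier, reply generator, and escalation queue here are hypothetical stand-ins, not a real API):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Complaint:
        customer_id: str
        text: str

    CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff; tuning this is the hard part

    def classify(text: str) -> tuple[str, float]:
        """Stub: the real version would ask the complaint-trained model
        for a (difficulty, confidence) estimate."""
        return "easy", 0.95

    def generate_reply(text: str) -> str:
        """Stub: the real version would return a model-drafted response."""
        return "Thanks for reaching out! Here is what we suggest: ..."

    def escalate_to_human(c: Complaint) -> str:
        """Stub: the real version would push onto the support team's queue."""
        return f"Complaint from {c.customer_id} routed to a human agent."

    def handle(c: Complaint) -> str:
        difficulty, confidence = classify(c.text)
        # Easy and medium complaints the model is confident about stay
        # automated; complex or low-confidence ones go to the humans.
        if difficulty == "complex" or confidence < CONFIDENCE_THRESHOLD:
            return escalate_to_human(c)
        return generate_reply(c.text)

    print(handle(Complaint("c-42", "My order arrived damaged.")))
    ```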

    Idk whether the government or the public should stop AI from taking human jobs or let it happen. I’m torn. Optimistically, workers can find new jobs. But we should imagine that at least one person will be fired and won’t find a new job. He’ll be jobless for months, with an epic headache because he can’t pay next month’s bills.

  • mPony@lemmy.world · 1 month ago

    This article is one of the most down-to-earth, realistic observations on technology I’ve ever read. Utterly striking as well.

    Go Read This Article.

    • Dkarma@lemmy.world · 1 month ago

      This article is a regurgitation of every tech article since the microchip. There is literally nothing new here. Tech makes labor obsolete. Tech never considers the ramifications of tech.

      These things have been known since the beginning of tech.

      • akwd169@sh.itjust.works · 1 month ago

        What about the climate impact? You didn’t even address that. That’s the worst part of the AI boom: we’re already way in the red on climate change, and this is going to accelerate the problem rather than slow or stop it (let alone reverse it).

        • Not_mikey@slrpnk.net · 1 month ago

          That’s a very solvable problem, though. AI can easily be run on green energy, and a lot of the new data centers being built are using it; tons are popping up around Seattle with its abundance of hydro power. Compare that to meat production or combustion-based transportation, which face a much harder transition, and this seems far less of an existential problem than the author makes it out to be.

          Also, most of the energy is needed for training, which can be done at any time, so it can run during off-peak hours. It can also absorb midday solar surpluses that would otherwise put strain on the grid.
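
          A minimal sketch of that scheduling idea (the hour windows and the surplus signal are assumptions; a real scheduler would read live grid data):

          ```python
          from datetime import datetime

          OFF_PEAK_HOURS = set(range(0, 6))         # assumed overnight window
          SOLAR_SURPLUS_HOURS = set(range(11, 15))  # assumed midday solar peak

          def can_train(now: datetime, grid_surplus_mw: float = 0.0) -> bool:
              # Run training overnight, during the assumed midday solar peak,
              # or whenever the grid reports surplus generation to soak up.
              return (
                  now.hour in OFF_PEAK_HOURS
                  or now.hour in SOLAR_SURPLUS_HOURS
                  or grid_surplus_mw > 0
              )

          print(can_train(datetime(2024, 6, 1, 13)))  # True: midday solar window
          print(can_train(datetime(2024, 6, 1, 19)))  # False: evening peak demand
          ```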

          This all assumes it’s done right, which it may not be, and it could exacerbate the ditch we’re already in, but the technology itself isn’t inherently bad.