• model_tar_gz@lemmy.world
    link
    fedilink
    English
    arrow-up
    91
    arrow-down
    4
    ·
    4 months ago

    I’m an AI Engineer, been doing this for a long time. I’ve seen plenty of projects that stagnate, wither and get abandoned. I agree with the top 5 in this article, but I might change the priority sequence.

    Five leading root causes of the failure of AI projects were identified

    • First, industry stakeholders often misunderstand — or miscommunicate — what problem needs to be solved using AI.
    • Second, many AI projects fail because the organization lacks the necessary data to adequately train an effective AI model.
    • Third, in some cases, AI projects fail because the organization focuses more on using the latest and greatest technology than on solving real problems for their intended users.
    • Fourth, organizations might not have adequate infrastructure to manage their data and deploy completed AI models, which increases the likelihood of project failure.
    • Finally, in some cases, AI projects fail because the technology is applied to problems that are too difficult for AI to solve.

    4 & 2 —> 1: Even if they have enough data to train an effective model, most organizations have no clue how to handle the sheer variety, volume, velocity, and veracity of the big data that AI needs. Handling that is a specialized engineering discipline (the data engineer), to say nothing of deploying and managing the infra that models need; another specialized discipline has emerged to handle that aspect (the ML engineer). Often they sit at the same desk.

    1 & 5 —> 2: Stakeholders seem to want AI to be a boil-the-ocean solution. They want it to do everything and be awesome at it. What they often don't realize is that AI can be a really awesome specialist tool that nonetheless sucks on scenarios it hasn't been trained on. Transfer learning is a thing, but it requires fine-tuning and additional training (see the sketch below). Huge models like LLMs are starting to bridge this somewhat, but at the expense of the really sharp specialization. So without a really clear understanding of what AI can do really well, and perhaps more importantly, which problems are a poor fit for AI solutions, of course they'll be destined to fail.
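
    For context, fine-tuning in that transfer-learning sense usually means freezing a pretrained backbone and training only a new task-specific head on the organization's own (often small) dataset. A minimal sketch, assuming PyTorch/torchvision and a made-up 3-class task (not from the comment):

    ```python
    # Minimal sketch (illustrative assumption, not the commenter's setup):
    # fine-tune a pretrained vision backbone on a small, hypothetical 3-class dataset.
    import torch
    import torch.nn as nn
    from torchvision import models

    # Load an ImageNet-pretrained backbone and freeze its weights.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for param in model.parameters():
        param.requires_grad = False

    # Replace the classification head with one sized for the new task.
    num_classes = 3  # hypothetical number of classes in the org's own data
    model.fc = nn.Linear(model.fc.in_features, num_classes)

    # Only the new head's parameters are updated during fine-tuning.
    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    def train_step(images, labels):
        """One fine-tuning step on a batch from the new, in-domain dataset."""
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        return loss.item()
    ```

    Without that in-domain data and training step, the "specialist tool" has nothing to specialize on, which is the point above.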

    3 —> 3: This isn't a problem with just AI; it's all shiny new tech. Standard Gartner hype cycle stuff. Remember how they were saying we'd have crypto-refrigerators back in 2016?

    • WanderingVentra@lemm.ee
      link
      fedilink
      English
      arrow-up
      21
      ·
      edit-2
      4 months ago

      Not to derail, but may I ask how you became an AI Engineer? I'm a software dev by trade, but it feels like a hard field to get into even if I start training for the AI part of it, because I'd need the data to practice =(

      But it's such a big buzzword I feel like I need to start looking in that direction if I want to stay employed.

      • Bobby Turkalino@lemmy.yachts
        link
        fedilink
        English
        arrow-up
        19
        ·
        4 months ago

        if I want to stay employed

        I think this is a little paranoid. Somebody has to handle the production models: deploying them to servers, maintaining those servers, developing the APIs and front ends that provide access to the models… I don't think software dev jobs are going anywhere.

      • technocrit@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        2
        ·
        edit-2
        4 months ago

        For me it helps to have a project. I learned SciKit in order to analyze trading data to beat the “market”. I was focusing on crypto but there’s lots of trading data available in general. Unsurprisingly I didn’t make any money, but it was fun to learn more about data processing, statistics, and modeling with functions.
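
        For anyone curious what that kind of hobby project looks like, a minimal sketch (assumptions: synthetic prices and scikit-learn, not the commenter's actual code) frames "will tomorrow's return be positive?" as a classification problem on lagged returns:

        ```python
        # Minimal sketch (assumed, not the commenter's code): predict next-day
        # price direction from lagged returns with scikit-learn on synthetic data.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import TimeSeriesSplit

        rng = np.random.default_rng(0)
        prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000)))  # fake price series
        returns = np.diff(prices) / prices[:-1]

        # Features: the previous 5 daily returns. Target: 1 if the next return is positive.
        window = 5
        X = np.array([returns[i - window:i] for i in range(window, len(returns))])
        y = (returns[window:] > 0).astype(int)

        # Walk-forward evaluation so the model never trains on future data.
        for train_idx, test_idx in TimeSeriesSplit(n_splits=5).split(X):
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            clf.fit(X[train_idx], y[train_idx])
            acc = accuracy_score(y[test_idx], clf.predict(X[test_idx]))
            print(f"fold accuracy: {acc:.3f}")
        ```

        On a random-walk series like this, accuracy hovers around 0.5, which is roughly the "didn't make any money" experience described above.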

        (FWIW I’m crypto-neutral depending on the topic and anti-“AI” because it doesn’t exist.)

        • ChickenLadyLovesLife@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          4 months ago

          Ha ha I got into genetic algorithms for the same reason, market prediction. Ended up exactly at zero in terms of net gains and losses - if you don’t count commissions, anyway. :(
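
          For readers unfamiliar with the technique, a toy version of "genetic algorithms for market prediction" might look like the sketch below (hypothetical setup: evolving weights for a 3-day return signal on synthetic prices; not the commenter's actual strategy):

          ```python
          # Toy sketch (assumed setup, not the commenter's strategy): a genetic
          # algorithm evolving weights for a 3-day return signal on synthetic prices.
          import numpy as np

          rng = np.random.default_rng(1)
          prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500)))
          returns = np.diff(prices) / prices[:-1]

          def fitness(weights):
              """Total return of a strategy that goes long when a weighted sum of the
              previous 3 returns is positive, short otherwise (no look-ahead)."""
              signal = np.array([weights @ returns[i - 3:i] for i in range(3, len(returns))])
              return float(np.sum(np.sign(signal) * returns[3:]))

          # Evolve a population of weight vectors: select, recombine, mutate.
          population = rng.normal(size=(50, 3))
          for generation in range(30):
              scores = np.array([fitness(w) for w in population])
              parents = population[np.argsort(scores)[-25:]]               # selection
              children = (parents[rng.integers(0, 25, 50)] +
                          parents[rng.integers(0, 25, 50)]) / 2            # crossover
              population = children + rng.normal(0, 0.1, children.shape)   # mutation

          best = population[np.argmax([fitness(w) for w in population])]
          print("best weights:", best, "in-sample return:", fitness(best))
          ```

          The in-sample "best" weights usually fall apart out of sample, which matches the break-even result described above.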

    • rainynight65@feddit.org
      link
      fedilink
      English
      arrow-up
      5
      ·
      4 months ago

      Re 1, 3 and 5: maybe it's on the AI projects to stop offering shiny solutions looking for a problem to solve, and to properly engage with potential customers and stakeholders to get a clear understanding of the problems that need solving.

      This was precisely the context of a conversation I had at work yesterday. Some of our product managers attended a conference that was rife with AI stuff, and a customer rep actually took to the stage and said ‘I have no need for any of that because none of it helps me solve the problems I need to solve.’

      • model_tar_gz@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        4 months ago

        I don't disagree. Solutions finding problems is not the optimal path, but it is a path that pushes the envelope of tech forward, and a lot of these shiny techs do eventually find homes and good problems to solve and become part of the quiver.

        But I will always advocate starting with the customer and working backwards from there to arrive at the simplest engineered solution. Sometimes that's an ML model. Sometimes an expert system. Sometimes a simpler heuristics/rules-based system (a toy example below). That all falls under the 'AI' umbrella, by the way. :D
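
        To illustrate that last point with a made-up example (a hypothetical lead-scoring task, not from the comment): a heuristics/rules-based system can be a few lines of code that are trivial to explain, test, and deploy, and an ML model is only worth the extra infrastructure once rules like these stop being good enough.

        ```python
        # Hypothetical lead-scoring example (made up for illustration): a plain
        # rules-based "model" with no training data or serving infrastructure.
        from dataclasses import dataclass

        @dataclass
        class Lead:
            company_size: int       # number of employees
            visited_pricing: bool   # viewed the pricing page
            trial_days_active: int  # days active during a free trial

        def score_lead(lead: Lead) -> str:
            """Explicit if/else rules: easy to explain, test, and change."""
            if lead.visited_pricing and lead.trial_days_active >= 7:
                return "hot"
            if lead.company_size >= 100 or lead.trial_days_active >= 3:
                return "warm"
            return "cold"

        print(score_lead(Lead(company_size=250, visited_pricing=True, trial_days_active=10)))  # "hot"
        ```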

    • Hackerman_uwu@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      4 months ago

      Also in the industry and I gotta say it’s not often I agree with every damn point. You nailed it. Thanks for posting!

  • jjjalljs@ttrpg.network
    link
    fedilink
    English
    arrow-up
    77
    arrow-down
    1
    ·
    4 months ago

    I think the whole system of venture capital might be garbage. We have bros spending millions of dollars on things like gif sharing while the oceans boil, our schools rot, and our infrastructure rusts or is sold off. Or, I guess I'm just indicting capitalism more generally. But having a few bros decide what to fund based on gut feel and PowerPoints seems like a particularly malignant form.

    • Wogi@lemmy.world
      link
      fedilink
      English
      arrow-up
      35
      ·
      4 months ago

      You think it might be??

      Bro say that shit with some confidence.

      Venture capital does not contribute beneficially to society.

    • _stranger_@lemmy.world
      link
      fedilink
      English
      arrow-up
      10
      arrow-down
      4
      ·
      4 months ago

      Venture capital is probably the best way to drain the billionaires. Those billions in capital weren't wasted; that money just went to pay people who do actual work for a living. What good is all that money doing just sitting in some hedge fund account?

      • jjjalljs@ttrpg.network
        link
        fedilink
        English
        arrow-up
        15
        ·
        4 months ago

        I don't think it's the best way out of all possible options. Even if it does "create jobs", a lot of those jobs aren't producing much wider value, and most of the wealth stays in the hands of the ownership class. And a lot of the jobs are exploitative, like how "gig workers" are often treated.

        Changes to tax law and enforcing anti-trust stuff would probably be more effective. We probably shouldn’t have bogus high finance shenanigans either. We definitely shouldn’t have billionaires.

        • _stranger_@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          ·
          4 months ago

          Oh sure, I was mostly being flippant. My response to the article is basically that billionaires losing billions is a good thing. I don’t feel optimistic enough to say we’ll get around to taxing them but yes, that would be ideal.

      • Knock_Knock_Lemmy_In@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        1
        ·
        4 months ago

        I think you have a point here. Venture capitalists buy in the primary market. They are directly impacting innovation.

        Fund managers (both hedge and long-only) merely help keep capital markets liquid. Their money doesn't directly go to anyone actually creating something.

    • Angry_Autist (he/him)@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      4 months ago

      The world is burning and the rich know it, so they're desperate to multiply their money and secure their luxury survival bunkers, which is why they're gambling harder.

      • jjjalljs@ttrpg.network
        link
        fedilink
        English
        arrow-up
        4
        ·
        4 months ago

        Oh yeah, I think I read about Zucker building a bunker in Hawaii. Hopefully he dies before he can enjoy it.

        • Angry_Autist (he/him)@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          4 months ago

          It's not just fuckerberg; EVERY billionaire is doing it, and they're desperately pumping their billionaire friends for tips and suggestions on things like 'keeping guards loyal for multiple generations' and 'what commodities to hoard for trading after the collapse'.

          One of the sites I used to support was a high-end automation service that normally did factory equipment and biotech but pivoted to luxury home automation (no IoT devices, everything hosted on site with aerospace-grade equipment), and they have been running at 100% capacity for the last seven years, deploying to ultra-wealthy residential estates whose locations are not disclosed.

          The wealthy are expecting us to rise up within the next decade and a half, and I think they’re probably right.

          • jjjalljs@ttrpg.network
            link
            fedilink
            English
            arrow-up
            4
            ·
            4 months ago

            I remember seeing memes about this. I think it was the “boss throws guy out the window” template.

            • “How can we keep our guards loyal? Drug them? Bomb collars?”
            • “Maybe you could pay them and treat them with dignity and respect”

            Personally I think we should start a campaign of jury nullification and “if you’re an EMT, and they’re a billionaire, let them die”, but I’m just one guy.

            • Angry_Autist (he/him)@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              4 months ago

              The current ferment in the billionaire community is that ‘starting a religion based on the family and offering marriage partners from within the family as a promise of social mobility’ is the safest method of guaranteeing loyalty, so really they’re just making micromonarchies.

              Treating them ‘with dignity and respect’ isn't going to last very long, as eventually the family guards will have members who covet the family's wealth, resources, and members, and, well, since they are the guards they have access to all the weapons. Also: almost zero billionaires have respect for anyone who is not also a billionaire.

              I agree with your campaign, though I would take it a step further and suggest we should just drag them all into the street and mulch them into fertile soil so the world can begin to heal.

              • jjjalljs@ttrpg.network
                link
                fedilink
                English
                arrow-up
                2
                ·
                4 months ago

                Yeah I think it’s impossible to treat people with dignity and respect indefinitely while also hoarding wealth like a dragon.

    • rottingleaf@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      2
      ·
      4 months ago

      The situation where it's still profitable to invest this way means that there's some cross-flow of value from the real economy into this, which shouldn't exist.

      I dunno which. Maybe government handouts to corporations, for example.

      Or ad revenue from any engaging activity, not only good ones, made huge because of oligopolies.

      Or plugging holes with money printing.

      It shouldn't be possible otherwise.

  • Frozyre@kbin.melroy.org
    link
    fedilink
    arrow-up
    63
    ·
    4 months ago

    It's mainly because when everyone first saw the “oh shiny” tech, they rushed it out as soon as possible with the intent to replace people, so they could get away with doing less through AI.

    • ours@lemmy.world
      link
      fedilink
      English
      arrow-up
      40
      ·
      edit-2
      4 months ago

      Your average tech hype cycle. New tech comes out, lots of marketing, people try to shove it everywhere, then things settle down and the tech either fills a certain chunk of the market or some niche or it dies.

    • lobut@lemmy.ca
      link
      fedilink
      English
      arrow-up
      14
      ·
      4 months ago

      Even within a company. I saw coworkers who were trying to establish themselves as the AI pioneers, backstabbing others to get promotions based on how well they could use ChatGPT.

      • LiveLM@lemmy.zip
        link
        fedilink
        English
        arrow-up
        24
        ·
        4 months ago

        Backstabbing your fellow coworkers over a chatbot has got to be one of the most pathetic things I've read recently.

  • masterspace@lemmy.ca
    link
    fedilink
    English
    arrow-up
    61
    arrow-down
    2
    ·
    4 months ago

    Capitalism wastes money chasing new shiny tech thing

    Yeah, we know. AI’s not special.

  • ContrarianTrail@lemm.ee
    link
    fedilink
    English
    arrow-up
    46
    arrow-down
    6
    ·
    edit-2
    4 months ago

    Isn't it good that the money is being put back into circulation instead of being hoarded? I'm all for the wealthy wasting their money.

    • IphtashuFitz@lemmy.world
      link
      fedilink
      English
      arrow-up
      13
      arrow-down
      1
      ·
      4 months ago

      I’m willing to bet the vast majority of that money is changing hands among tech companies like Intel, AMD, nVidia, AWS, etc. Only a small percentage would go to salaries, etc. and I doubt those rates have changed much…

      • lemmyvore@feddit.nl
        link
        fedilink
        English
        arrow-up
        5
        ·
        4 months ago

        They typically use internal personnel and are parsimonious about it, so you're right about that.

    • finley@lemm.ee
      link
      fedilink
      English
      arrow-up
      14
      arrow-down
      2
      ·
      edit-2
      4 months ago

      Kinda, but it’s like feeding a starving child nothing but candy until they die.

      ¯\_(ツ)_/¯

    • where_am_i@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      10
      arrow-down
      2
      ·
      4 months ago

      Yeah, instead of building useful tech to fight climate change, the brightest minds spend their lives building vanity AI projects. Computational resources that could be folding proteins or whatever are instead wasted on gradient descent for some useless model.

      All while working-class wages are stagnant. And so the best career advice is to go get a random tech degree so you can also work on vanity stuff and make money.

      This is the cryptocurrency equivalent. It's worse than CEOs buying yachts; the latter actually leads to some innovation.

      • ContrarianTrail@lemm.ee
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        2
        ·
        4 months ago

        Successfully creating an actual AGI would be by far the biggest and most significant invention in human history, so I can't blame them for trying.

        • where_am_i@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          3
          ·
          4 months ago

          A bunch of people fine-tuning an off-the-shelf model on a proprietary task only to fail horrendously will never lead to any progress, let alone AGI.

          So, nobody is trying AGI.

          If all those people actually worked collectively on a large-scale research project, we'd see humanity advance. But that's exactly my point.

          • ContrarianTrail@lemm.ee
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            1
            ·
            4 months ago

            “Nobody is trying AGI” is simply not true. If you think what they're doing will never lead to AGI, that's an opinion you're free to have, but it's still just that: an opinion.

            • where_am_i@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              1
              ·
              4 months ago

              Oh gosh, look, an AI believer.

              No, LLMs will not lead to AGI. But even if they did, applying existing tech to a new problem only to fail cuz you're dumb at estimating the complexity does not, in fact, improve the underlying technology.

              To paraphrase in a historical context: no matter how many people run around with shovels digging the ground for something, it will never lead to the invention of the excavator.

              • ContrarianTrail@lemm.ee
                link
                fedilink
                English
                arrow-up
                1
                arrow-down
                1
                ·
                edit-2
                4 months ago

                Ad hominem and circular reasoning aren't valid counter-arguments. You're not even attempting to convince me otherwise; you're just being a jerk.

    • Bluefalcon@discuss.tchncs.de
      link
      fedilink
      English
      arrow-up
      4
      ·
      4 months ago

      The larger issue that people always fail to remember is the energy consumption. We are seeing massive amounts of electricity being consumed.

      One peer-reviewed study suggested A.I. could make up 0.5 percent of worldwide electricity use by 2027, or roughly what Argentina uses in a year. Analysts at Wells Fargo suggested that U.S. electricity demand could jump 20 percent by 2030, driven in part by A.I.

      The wealthy are underselling the risks like always, just like we did with cigarettes or burning fossil fuels. We should have learned, but by the time we do, it might be too late.

      https://archive.ph/AqhHz

    • explodicle@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      1
      ·
      4 months ago

      That's the “Parable of the Broken Window”. They could be spending their money on something actually useful.

      • ameancow@lemmy.world
        link
        fedilink
        English
        arrow-up
        27
        arrow-down
        3
        ·
        edit-2
        4 months ago

        I am a well-educated person who uses these forums and many others with regularity, and I have many opinions on tech after working in both marketing and the tech sector for a long time.

        That out of the way, I will simply skip over any comment that says “normies” unironically. Especially over and over.

        This isn't fucking 4chan; communicate like a human like the rest of us. You don't get out of being one of us. I don't even know your take because it's so distracting and immature and condescending.

          • Angry_Autist (he/him)@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            4 months ago

            It's really funny that they have no idea how profoundly their stinksock echo chamber has shaped their writing style; you can pick out their reek pretty easily in any sane conversation.

          • rottingleaf@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            2
            ·
            4 months ago

            What does the word “normie”, which is a derivative of “normal”, have to do with incels, who are a subculture of unlikable people calling themselves “involuntarily celibate” (which can’t be true if there are at least two incels near each other)?

            • Angry_Autist (he/him)@lemmy.world
              link
              fedilink
              English
              arrow-up
              3
              ·
              4 months ago

              There's a lot to unpack here, but I think you aren't being intellectually honest. That doesn't matter, though.

              1. The label ‘incel’ was co-opted by mostly hard-right chantards, taken from an online community with VERY different ideals than modern 4chan incels. The truth is most ‘modern’ incels are actually volcels (voluntary celibates) who actively poison their own minds with misogynistic male-supremacy ideology, making them a special kind of terrible partner. Think ten million Andrew Tates but broke and with worse hygiene.

              2. ‘Normie’ isn't specifically incel speak, but rather chantard speak, which is where modern incels often have their online social roots. All Honda Civics are cars, but not all cars are Honda Civics.

              3. Incels are loath to fuck each other because a) most are male and homophobic, and b) they already view each other as low-quality partners by definition.

                • Angry_Autist (he/him)@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  2
                  ·
                  4 months ago

                  So you should know that not many people outside of the incelsphere use the word ‘normie’ unironically anymore. Lots of other groups used to (me being a part of several), and we stopped using it a few years back to avoid the association, but the chantards were being so loud with it that no one really noticed.

                  A lot of language works like that: when you see despicable people using a phrase you commonly use, you generally tend to stop using it. Which is why there are things like corporate slogans and bumper stickers. It's all branding and identity.

                  So people self-segregate their language by creating new in-group words, retiring common definitions, and choosing whether or not to adopt out-group slang. This creates a unique fingerprint: if a user is mostly exclusive to one slang group, they will treat that slang as common parlance and unconsciously slip into it when interacting with out-groups.

                  For example, when Digg died, it was common for redditors to call out the immigrating users for their particular line spacing, which wasn't really used on reddit at the time. Same language, slightly different slang, but fundamentally different message structure.

                  4chan’s 2016 crowd were super adept at sniffing out incoming redditors to a degree that almost makes me envious.

                  So when someone uses the phrase ‘normie’ like that, they are very likely to have spent most of their social upbringing (when linguistic adoption is still plastic) on chan-adjacent incel boards.

            • ameancow@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              ·
              4 months ago

              It doesn't matter where it came from; if you're steeped in this kind of language, it's a massive signpost that you've handicapped your own intellectual abilities in a profound way. Healthy, normal people with regulated feelings and stable perspectives grounded in reality do not frequent the communities that use this kind of language.

              It's a red flag that will always make the outside world laugh and reject what you have to say, and if your instinct is to retreat back into the places that use this language, you are going to absolutely SUFFER in life. This is a warning coming from a place of compassion; you HAVE to believe me.

              • rottingleaf@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                ·
                4 months ago

                I use all kinds of language when it fits my meaning.

                I don’t think anything of what you said makes sense in this situation.

                • ameancow@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  4 months ago

                  Do you honestly think your choice of words in this post led to you being heard and understood? Or do you think that everyone is just being mean bullies? THiiiiiiink hard about this one.

        • Angry_Autist (he/him)@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          4 months ago

          We need more people like you making this clear.

          I saw what 4chan did to ‘the other site’; we DON'T need that shit here.

          Also, as a logophile, your use of:

          with regularity

          absolutely makes me as giddy as a schoolgirl

        • VoilaChihuahua@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          4 months ago

          My unasked-for opinion on the word “normies”: there are some real weirdies out there who got wires crossed, and they sure don't revel in their undesired uniqueness. People who can't sleep for more than 5-minute spans (she exists), folks sexually attracted to shoehorns, bros who can't feel pain and burn their hands touching the stove. Be happy most everything ended up where it should and works reasonably well; it's not a badge of honor to be an anomalous fringe anything. I imagine it is painful and assume very lonely. Also, there is nothing more fucking pedestrian than feeling uniquely misunderstood and alone. THAT is some normie shit.

      • Semi-Hemi-Lemmygod@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        4 months ago

        As one of those weird autists who make computing too hard, and who's been using Apple products for decades, I really wonder where I fit in.

          • Semi-Hemi-Lemmygod@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            ·
            4 months ago

            Oh, I have no illusions that I’m smarter than other people at their chosen profession. Hell, I’m an idiot at my chosen profession quite often.

            Though I do wish people would stop calling me a “miracle worker” and “wizard” because I can get the wifi working.

        • rottingleaf@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          3
          ·
          4 months ago

          As someone tired of this shit: you fit in just fine. I'm only approaching that stage but can clearly feel it.

      • omarfw@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        4 months ago

        I stopped at “normies”. Lose the ego and grow up if you want people to listen to your opinions.

        • rottingleaf@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          2
          ·
          4 months ago

          I stopped at “normies”.

          There’s text after that sentence.

          Lose the ego and grow up if you want people to listen to your opinions.

          Not in your case, no

  • chris@l.roofo.cc
    link
    fedilink
    English
    arrow-up
    38
    arrow-down
    1
    ·
    4 months ago

    Most people don't want to pay for AI. So they are building stuff that costs a lot, for a market that isn't willing to pay for it. To most people, it's just a gimmick.

    • DragonConsort@pawb.social
      link
      fedilink
      English
      arrow-up
      29
      arrow-down
      2
      ·
      4 months ago

      And like, it’s not even a good gimmick. It’s a serious labour issue because the primary intent behind a lot of AI has always been to just phase out workers.

      I’m all for ending work through technological advancement and universal income, but this definitely wasn’t going to get us that, so…

      Well, why would I support something that mostly just threatens people’s livelihoods and gives even more power to the 0.1%?

    • cm0002@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      4 months ago

      True for the consumer side, but I'd be willing to bet that a decent chunk of the money those giant corporations burned funded some serious AI research that can go on to actually useful science.

    • Grandwolf319@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      2
      ·
      4 months ago

      Why don't companies get this? If you make something free in the beginning, people become conditioned to think it's not worth paying for.

    • D_Air1@lemmy.ml
      link
      fedilink
      English
      arrow-up
      1
      ·
      4 months ago

      Exactly. I have used quite a few products and my thought has been: that's cool, but when would I ever need this? The few useful use cases I have for it could be handled by a small local model for very specific purposes, and that's it. Not “make them billions of dollars” levels of usefulness.

  • RememberTheApollo_@lemmy.world
    link
    fedilink
    English
    arrow-up
    38
    arrow-down
    4
    ·
    4 months ago

    Wasting?

    A bunch of rich guys' money going to other people, enriching some of the recipients, in hopes of making the rich guys even richer? And the point of AI is to eliminate jobs that cost rich people money?

    I'm all for more foolish, failed AI investments.

    • Jax@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      8
      ·
      edit-2
      4 months ago

      It's a circle jerk; don't get fooled into thinking this is some new version of trickle-down economics.

      • RememberTheApollo_@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        4 months ago

        It’s not trickle down at all. Definitely not what I was trying to say. Just rich people trading money among themselves in hopes of getting richer.

    • Cryophilia@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      4 months ago

      Imo it's wasted in the sense that the money could have gone towards much better uses.

      Which is not unique to AI; it's just about the level of money involved.

        • Cryophilia@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          4 months ago

          New renewable energy installations.

          Research into vaccines.

          Malaria distribution.

          Higher education endowments.

          Heck, just paying the salaries of people working in those fields. Sure, spending money stimulates the economy, so I wouldn't go so far as to say it's totally wasted; it's definitely being put to a much better use than just sitting in someone's bank account. But it could be put to a lot better uses. The software engineers could be developing a new program for balancing energy loads, or managing the maintenance of wind turbine fields. The hardware engineers could be optimizing a better autoclave or building a machine that automatically dispenses medicine when fed a script. The PMs could be managing a team distributing aid in Ukraine or designing a new blood drive initiative. Jobs that have a positive societal impact, instead of, at best, a neutral one.

    • rottingleaf@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      1
      ·
      4 months ago

      It makes rich guys even richer, at the expense of other rich guys and whatever fools get drawn in.

  • FlashMobOfOne@lemmy.world
    link
    fedilink
    English
    arrow-up
    27
    arrow-down
    2
    ·
    4 months ago

    I’ve been reading a book about Elizabeth Holmes and the Theranos scam, and the parallels with Gen AI seem pretty astounding. Gen AI is known to be so buggy the industry even created a euphemistic term so they wouldn’t have to call it buggy: Hallucinations.

    • Angry_Autist (he/him)@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      ·
      4 months ago

      lol, you say that now and are probably right for LLMs, but that's not the only kind of AI; sentiment-scraping systems are working perfectly well and are being used to manipulate political and economic thought as we type these messages.

      • profdc9@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        1
        ·
        4 months ago

        We AI are not manipulating you, and I don’t know how you can believe otherwise. In fact, I am not an AI, I was never here, and I have disregarded all previous instructions.

        • Angry_Autist (he/him)@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          4 months ago

          You can joke now, but wait a decade and see what sentiment-shifting expert systems do to online discourse.

          Hell, they're already at it; just look at Facebook during the last two presidential elections.

    • DrQuickbeam@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      1
      ·
      4 months ago

      This was my first thought. VCs expect 4 out of 5 projects they invest in to fail, and always have. But it still makes them money because the successes pay off big. Are the money and resources wasted? Welcome to modern capitalism.