It goes without saying: DVDs/Blu-rays.

    • moseschrute@lemmy.ml · 15 hours ago

      Not defending this, but it’s annoying because Google and all search engine results are being poisoned by AI-written slop. It seems like LLMs may provide a better search experience, but they’re also the thing ruining the search experience.

      I don’t really know what I’m talking about, but I imagine that if AI slop is ruining search, it will also start to ruin itself when the current slop is used to train future LLMs. Basically, I think AI will short-circuit itself in the long run.

      Now, will it short-circuit itself enough for Microsoft to stop shoving it down our throats? Probably not.

    • BlackPenguins@lemmy.world · 2 days ago

      Yeah, I do realize the inevitable problem when their sources dry up because no one is communicating anymore, but for quick questions about how something works in the world it’s extremely convenient. I’d just be asking Google anyway.

      • WeirdGoesPro@lemmy.dbzer0.com · 2 days ago

        But you can’t see the source of the information, which means it could be a reputable source or it could be Joe-Sucks-His-Own-Dick from Reddit. In another comment, I pointed out that AI was telling people to put glue on pizza to keep the cheese from falling off. If you can see the source, you are much more likely to be able to judge the veracity of the information.

      • WeirdGoesPro@lemmy.dbzer0.com · 2 days ago

        Not trusting ChatGPT results, which are known to hallucinate false information, as your primary search method is a silly take? AI was telling people to put glue on pizza to keep the cheese from falling off. If you can see that the source of that information is a Reddit shitpost, you are way more likely to make a good judgment call about the veracity of that information.

        If you want searches without sponsored results, use SearXNG or an equivalent that strips out the ads.
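        For anyone curious, here’s a rough sketch of what querying a self-hosted SearXNG instance from a script could look like, so every result’s source URL stays visible. It assumes a local instance at http://localhost:8080 with the JSON output format enabled in its settings (it’s off by default), and the response field names are my best guess rather than a definitive reference.

        ```python
        # Rough sketch, not a drop-in tool: query a self-hosted SearXNG instance
        # and print each result together with its source URL.
        # Assumes http://localhost:8080 is a local instance with JSON output
        # enabled in its settings; the result field names are illustrative.
        import requests

        SEARXNG_URL = "http://localhost:8080/search"  # hypothetical local instance

        def search(query: str, max_results: int = 5) -> None:
            resp = requests.get(
                SEARXNG_URL,
                params={"q": query, "format": "json"},
                timeout=10,
            )
            resp.raise_for_status()
            results = resp.json().get("results", [])
            # Each result carries its own URL, so you can judge the source yourself.
            for item in results[:max_results]:
                print(f"{item.get('title')} - {item.get('url')}")

        if __name__ == "__main__":
            search("why does cheese slide off pizza")
        ```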

        • venusaur@lemmy.world · 1 day ago

          The silly take is that using it is “part of the problem”.

          Also, the glue on pizza thing is nearly a moot point. The models are much more advanced now and will continue to be.

          The commercial LLMs can share their sources now, so that’s also a moot point.

          It’s not going away. Learn to use it effectively.

        • BlackPenguins@lemmy.world · 2 days ago

          You can actually ask for its sources now and fact-check them yourself. But just like anything you read online, use common sense. I’d see those same results in a Google search too.

            • BlackPenguins@lemmy.world · 2 days ago

              If it’s something serious, yes. Like fixing something. I also use it as an idea generator. I needed to figure out why my toilet wasn’t flushing, and it suggested the flapper. Once it pointed me in a direction, I went to YouTube and looked up a video on how to install it.

              • WeirdGoesPro@lemmy.dbzer0.com · 2 days ago

                If it’s something serious, yes.

                Good, then it is a bit less of a bad tool in this instance. Just don’t lose the habit of checking your sources—it’s a slippery slope.