• ours@lemmy.world · 4 months ago

    As long as it’s not Chromium, I’m happy people aren’t just handing over the keys to the Internet to Google.

    • FiniteBanjo@lemmy.today · 4 months ago

      Yeah, Waterfox is just another browser built on Mozilla’s Gecko engine, but without all the AI dickriding.

      • ours@lemmy.world · 4 months ago

        How terrible to offer client-side translation or webpage description for differently abled people!

              • stephen01king@lemmy.zip · 4 months ago

                Yes, so show me how incorrect their translation is, since you claim it’s incorrect.

                • FiniteBanjo@lemmy.today · 4 months ago

                  LLM- and ML-generated translations are produced one token at a time. That’s why AI chatbots hallucinate so often: the model decides the next most likely word in the sequence is “No” when the correct answer would be “Yes”, and the rest of the response devolves into convincing nonsense.
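The one-token-at-a-time process described in that comment can be sketched with a toy, hand-made next-token table; this is not a real language model, and the tokens and probabilities are invented purely to show how a single near-tie early in greedy decoding locks in the rest of the sequence:

```python
# Toy autoregressive generation: each step picks only the single most
# likely next token given the previous one (greedy decoding).
next_token = {
    "START": [("No", 0.51), ("Yes", 0.49)],  # near-tie: greedy commits to "No"
    "No": [(",", 1.0)],
    "Yes": [(",", 1.0)],
    ",": [("because", 1.0)],
    "because": [("END", 1.0)],
}

def generate(start="START"):
    tokens, cur = [], start
    while cur != "END":
        # Take the highest-probability continuation, ignoring alternatives.
        cur = max(next_token[cur], key=lambda t: t[1])[0]
        if cur != "END":
            tokens.append(cur)
    return " ".join(tokens)

print(generate())  # "No , because" — the 0.51 vs 0.49 choice fixed the answer
```

Once “No” is emitted, every later token is conditioned on it, so the model fluently continues down the wrong branch; real systems mitigate this with sampling strategies rather than pure greedy decoding.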

                    • stephen01king@lemmy.zip · 4 months ago

                    Those are not examples, just what you claim will happen based on what you think you understand about how LLMs work.

                    Show me examples of what you meant. Just run some translations in their AI translator or something and show me how often they make inaccurate translations. Doesn’t seem that hard to prove what you claimed.