• ColeSloth@discuss.tchncs.de
    2 days ago

    I use an app called Off Grid and load AI models onto it (right now I’m using Gemma 4). It all runs on my phone: nothing in the cloud, no data sent anywhere, completely local. No entities get shit from it. It’s the only way I’ll use AI.

    • qualia@lemmy.world
      1 day ago

      I highly value privacy, but the gap between local LLMs and top-of-the-line cloud LLMs (e.g. Claude and DeepSeek) is still too great for me to switch to local completely.

      In the meantime, I use PWAs to sandbox the cloud LLMs from everything else (and from each other), and I try to create some semantic distance between myself and my queries.

      How about that leaked Claude source code? Is there a reliably clean version of that available anywhere yet?

      • Bob Robertson IX @discuss.tchncs.de
        2 days ago

        I have my local LLM set up, and it matches Sonnet 4.6 in output quality. Performance-wise it’s slightly slower, but still faster than I can respond.

        This is with a Strix Halo APU with 128GB unified memory using the latest Qwen3.6 models with llama.cpp.
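
        For anyone curious what a setup like this looks like in practice, a minimal sketch of serving a GGUF model with llama.cpp’s `llama-server` (the model filename, context size, and port here are my assumptions, not the commenter’s actual settings):

        ```shell
        # Hypothetical model file; grab a Qwen GGUF quant from Hugging Face first.
        ./llama-server \
          -m ./models/qwen-gguf-q4_k_m.gguf \
          --ctx-size 8192 \
          --n-gpu-layers 99 \
          --port 8080
        # llama-server then exposes an OpenAI-compatible API at http://localhost:8080,
        # so any local chat client that speaks that API can point at it.
        ```

        On a unified-memory APU like Strix Halo, offloading all layers (`--n-gpu-layers 99`) is what makes large models practical, since the GPU shares the full 128GB pool.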