• 6 Posts
  • 227 Comments
Joined 2 years ago
Cake day: July 1st, 2023


  • Good to hear you figured it out with router settings. I’m also new to this but got all of that figured out this week. As other commenters suggested, I went with a reverse proxy and configured it. I chose Caddy over nginx for ease of install and config. I documented just about every step of the process. I’m a little scared to share my website on public forums just yet, but PM me and I’ll send you a link to my infrastructure page where I share the steps and config files.
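
    For anyone curious, the Caddy side really is short. Here is a minimal Caddyfile sketch (hypothetical names: it assumes a service listening on localhost:8096 and a domain already pointed at your server, so adjust to your own setup rather than copying it verbatim):

    ```
    # Reverse proxy a local service behind a domain; Caddy obtains and renews HTTPS certs automatically.
    media.example.com {
        reverse_proxy localhost:8096
    }
    ```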





  • From what I’ve seen in arguments about this, Plex is generally more accessible, with quality-of-life features and an easier-to-understand interface for non-techie people to share with family and friends. Something that’s hard for nerdy people to understand is that average people are perfectly fine paying for digital goods and services. An older, well-off normie has far more money than sense and will happily pay a premium just to avoid rubbing two braincells together during setup, or for a nicer quality of experience. If you figure out how to make a genuinely useful plug-and-play service that works without the end user of average intelligence/domain knowledge stressing about how to set up, maintain, and navigate confusing layouts, you’ve created digital gold.

    This isn’t the fault of open source services; you can only expect so much polish from non-profit volunteers. It’s just the nature of consumer laziness, the expectation of professional product standards, and the path/product of least resistance.


  • I volunteer as a developer for a decade-old open source project. A sizable amount of my contribution is just cooking up decent documentation, or rewriting old docs from the original module authors written close to a decade ago, because they failed me information-wise when I needed them. Programmers, as it turns out, are very ‘eh, the code should explain itself to anyone with enough brains to look at it’ type of people. They’re so lost in the sauce of being hyperfluent tech nerds who instantly understand all the variables, functions, parameters, and syntax at first glance at the source code that they forget the need to re-translate it into regular human speak for people of varying intelligence/skill levels who can barely navigate the command line.


  • True! Most browsers don’t have native Gemini protocol support. However, a web proxy like the ones I shared lets you get Gemini support no matter the web browser. Gemtext is a simplified version of markdown, which means it’s not too hard to convert from gemtext to HTML. So, by scraping information from bloated websites, formatting it into simple gemtext markdown, then mirroring it back as a simple HTML page, it all works together nicely to re-render bloated sites on simple devices, using Gemini as the formatting medium. You don’t really need to understand the Gemini protocol to use newswaffle + the portal.mozz.us proxy in your regular web browser.
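
    To give a sense of why that conversion is cheap, here is a toy gemtext-to-HTML sketch (my own rough example, not newswaffle’s actual code; it skips preformatted blocks and blockquotes for brevity):

    ```python
    # Toy gemtext -> HTML converter: headings, link lines, list items, and plain text.
    import html

    def gemtext_to_html(gemtext: str) -> str:
        out = []
        for line in gemtext.splitlines():
            if line.startswith("###"):
                out.append(f"<h3>{html.escape(line[3:].strip())}</h3>")
            elif line.startswith("##"):
                out.append(f"<h2>{html.escape(line[2:].strip())}</h2>")
            elif line.startswith("#"):
                out.append(f"<h1>{html.escape(line[1:].strip())}</h1>")
            elif line.startswith("=>"):
                # Gemtext link line: "=> URL optional label"
                parts = line[2:].strip().split(maxsplit=1)
                if parts:
                    url = html.escape(parts[0])
                    label = html.escape(parts[1]) if len(parts) > 1 else url
                    out.append(f'<p><a href="{url}">{label}</a></p>')
            elif line.startswith("* "):
                # List items are emitted bare (no wrapping <ul>) to keep the sketch short.
                out.append(f"<li>{html.escape(line[2:])}</li>")
            elif line.strip():
                out.append(f"<p>{html.escape(line)}</p>")
        return "\n".join(out)

    print(gemtext_to_html("# Hello\n=> gemini://example.org A link\nPlain text."))
    ```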


  • Try to start curating your block list. Whenever you see a post or a community that rubs you the wrong way, or that you aren’t interested in, block it. Try to identify keywords that help you blacklist the stuff you don’t want to see.

    The overwhelming Lemmy biases and echo-chamber ideologies can be a little grating, for sure. The longer you’re here, the better curated your feed becomes and the more you’ll find (and contribute to) nonpolitical communities, which drastically improves the experience.



  • They are similar, and use some of the same underlying technology powered by the Readability library, but newswaffle gives more options for how to render the article (article mode, link mode, raw mode), it isolates images and gives each its own external URL you can click on, and it tells you exactly how much cruft it stripped from the original webpage (something about seeing “99.x% lighter” makes my brain tingle with the good chemicals). It works well with article indexes. You can also bookmark a newswaffle page to get reader view by default instead of clicking the button in the Firefox toolbar. Hope these examples help.





  • SmokeyDope@lemmy.world to Selfhosted@lemmy.world · lightweight blog?

    Would something like this interest you? Gemtext formatted to HTML is about as lightweight as it gets. There’s lots of automatic gemtext blog software on GitHub that also formats and mirrors an HTML copy. Whenever a news article gets rendered to gemtext through newswaffle, it shrinks the page size by about 95-99% while keeping the text intact. Let me know if you want some more information on Gemini stuff.




  • I just spent a good few hours optimizing my LLM rig: disabling the graphical interface to squeeze 150 MB of VRAM back from Xorg, setting my programs’ CPU niceness to the highest priority, and tweaking settings to find the memory limits.
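
    For anyone doing the same kind of squeezing, the checks boil down to something like this (a rough sketch assuming Linux with an NVIDIA card and nvidia-smi installed; I mostly run the equivalent commands by hand rather than from a script):

    ```python
    # Check free VRAM and bump this process's CPU priority.
    import os
    import subprocess

    # Query free VRAM in MiB via nvidia-smi.
    free_vram = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.free", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"Free VRAM: {free_vram} MiB")

    # Lower the niceness value (= raise priority); negative niceness needs root.
    try:
        os.nice(-10)
    except PermissionError:
        print("Run as root to set a negative niceness.")
    ```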

    I was able to increase the token speed by half a second while doubling the context size. I don’t have the budget for any big VRAM upgrade, so I’m trying to make the most of what I’ve got.

    I have two desktop computers. One has better RAM, CPU, and overclocking but a worse GPU. The other has a better GPU but worse RAM and CPU, and no overclocking. I’m contemplating whether it’s worth swapping GPUs to really make the most of the available hardware. It’s been years since I took apart a PC and I’m scared of doing something wrong and damaging everything. I dunno if it’s worth the time, effort, and risk for the squeeze.

    Otherwise I’m loving my self-hosted LLM hobby. I’ve been very into learning computers and ML for the past year. Crazy advancements, exciting stuff.



  • I run kobold.cpp, a cutting-edge local model engine, on my local gaming rig turned server. I like to play around with the latest models to see how they improve/change over time. The current chain-of-thought thinking models, like the DeepSeek R1 distills and Qwen QwQ, are fun to poke at with advanced open-ended STEM questions.

    STEM questions like “What does Gödel’s incompleteness theorem imply about scientific theories of everything?” or “Could the speed of light be more accurately referred to as ‘the speed of causality’?”
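
    If you’d rather poke at the model from a script than the web UI, kobold.cpp exposes an HTTP API. Here’s a minimal sketch sending one of those questions, assuming the default port 5001 and the KoboldAI-style generate endpoint (double-check against your own install):

    ```python
    # Send a prompt to a locally running kobold.cpp instance and print the reply.
    import json
    import urllib.request

    payload = {
        "prompt": "What does Gödel's incompleteness theorem imply about theories of everything?",
        "max_length": 300,       # tokens to generate
        "temperature": 0.7,
    }
    req = urllib.request.Request(
        "http://localhost:5001/api/v1/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Response shape follows the KoboldAI API: {"results": [{"text": ...}]}
        result = json.load(resp)
    print(result["results"][0]["text"])
    ```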

    As for actual daily use, I prefer Mistral Small 24B, treating it like a local search engine with roughly the legitimacy of Wikipedia. It’s a starting point to ask questions about general things I don’t know about or want advice on, and then I do further research through more legitimate sources.

    It’s important not to take the LLM too seriously, as there’s always a small statistical chance it hallucinates some bullshit, but most of the time it’s fairly accurate and is a pretty good jumping-off point for further research.

    Let’s say I want an overview of how I can repair small holes forming in concrete, or general ideas on how to invest financially, how to change fluids in a car, how much fat and protein is in an egg, etc.

    If the LLM says a word or related concept I don’t recognize, I grill it for clarifying info and follow it through the infinite branching garden of related information.

    I’ve used an LLM to help me go through old declassified documents and speculate on internal government terminology I was unfamiliar with.

    I’ve used a text-to-speech model to get it to speak, just for fun. I’ve used a multimodal model to get it to see/scan documents for info.

    I’ve used web search to get the model to retrieve information it didn’t know from a DDG search, again mostly for fun.

    Feel free to ask me anything, I’m glad to help get newbies started.



  • Not really. Lemmy.world is alright as long as you have two functioning braincells to read the community guidelines and understand that you can’t just say whatever you want in a large, fairly moderated public space.

    You can get away with being an anarchist edgelord on a small instance, but on Lemmy.world calls to violence are a no-no. When you get as large and stable as .world, you need stricter rules to cover your ass legally, which is understandable.

    The people who complain about being instance-banned usually leave out key information that makes them look bad, or just plain lie about why they were banned, which you can check if you go into the modlogs.

    Besides that, there’s always going to be power-mod fuckery on any internet forum you go to. At least you can post it to c/yepowertrippingbastards to vent.