How so? Educate us!
And? Blatant lies are not exclusive to AI texts. Every right-wing media outlet is full of blatant lies, yet they are written by humans (for now).
The problem is, if you properly prompt the AI, you get exactly what you want. Prompt it a hundred times and you get a hundred different texts, posted to a hundred different social media channels, generating hype. How on earth will you be able to detect this?
I’m genuinely curious. Is it feasible for them to maintain their own Chromium forks, or will the work become too much if Google keeps inserting more and more crap into it?
Pretending to have committed a crime is a crime itself
Gotta listen to some Götz Widmann haha.
You are only truly anonymous if you always use a VPN or Tor. If not, Reddit has your IP, and the ISP knows who is behind that IP. If LE knocks on Reddit’s door with a warrant, Reddit will hand over your IP, and LE then goes to the ISP with it to get your name.
they’re an underwater welder from a specific small town and they have three sons
You would be surprised how much less info than that is needed to ID a person. There are studies on ID’ing people via their favorites and last-watched lists on Netflix.
I like it! The main issue for me is that there isn’t enough content about my hobbies, and the “all” feed is mostly filled with reddit-this and lemmy-that (or now Threads) stuff, which is annoying because I’d rather use the platform than talk about it. But I hope this will change with some time.
I only use the browser. The UX and UI are pretty straightforward, but subscribing to communities on other instances is really weird. I need to copy the “handle” (e.g. !lemmy_support@lemmy.ml), manually add it to my instance domain (e.g. lemmy.world/c/lemmy_support@lemmy.ml), and then subscribe to it there. I don’t know if there are other ways (besides finding new communities via “all”).
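If anyone wants to script that manual step, here’s a minimal sketch (the instance and handle are just the example values from above):

```python
# Minimal sketch of the manual workflow: turn a community "handle" from
# another instance into the URL on your own instance where you can subscribe.
# (home_instance and handle below are just the example values from the comment.)

def local_community_url(home_instance: str, handle: str) -> str:
    """e.g. '!lemmy_support@lemmy.ml' -> 'https://lemmy.world/c/lemmy_support@lemmy.ml'"""
    name_at_host = handle.lstrip("!")  # drop the leading '!'
    return f"https://{home_instance}/c/{name_at_host}"

print(local_community_url("lemmy.world", "!lemmy_support@lemmy.ml"))
# -> https://lemmy.world/c/lemmy_support@lemmy.ml
```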
I’m not into the technical details of Lemmy or the fediverse, but I guess this is not easily solvable, since a remote instance doesn’t know that I’m a user of another instance.
Googlebot does execute JavaScript, but since rendering JS needs much more resources, JS crawling happens significantly less often than simple HTTP crawling. That’s why all the big sites still return server-side rendered content.
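To make that concrete, here’s a rough sketch (the HTML snippets are made up): a crawler that doesn’t execute JS only sees the text that’s already in the initial HTTP response, so a client-rendered shell looks nearly empty to it.

```python
import re

# Made-up example pages: what a plain HTTP crawler "sees" without running JS.

SERVER_RENDERED = """
<html><body>
  <h1>Article title</h1>
  <p>The full article text is already in the HTML the server sends.</p>
</body></html>
"""

CLIENT_RENDERED = """
<html><body>
  <div id="app"></div>
  <script src="/bundle.js"></script>
</body></html>
"""

def visible_text_without_js(html: str) -> str:
    """Crude stand-in for a non-rendering crawler: strip scripts and tags, keep text."""
    text = re.sub(r"<script.*?</script>", "", html, flags=re.S)
    text = re.sub(r"<[^>]+>", "", text)
    return " ".join(text.split())

print(visible_text_without_js(SERVER_RENDERED))  # -> real content
print(visible_text_without_js(CLIENT_RENDERED))  # -> basically nothing
```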
So, how much do you spend, why are you doing it, and do you get any funding or are you paying this out of your own pocket? (just trying to understand how the fediverse works)
Thing is, if this takes off and websites adopt it, FF will be forced to integrate it as well. I’d be fine with some websites not working in FF, but my mother will call me and say “the internet is broken”. I guess Mozilla doesn’t want that and/or can’t afford it.