Talk about being part of the problem.
Not defending this, but it’s annoying because Google and all search engine results are being poisoned by AI-written slop. It seems like LLMs may provide a better search experience, but they’re also the thing ruining the search experience.
I don’t really know what I’m talking about, but I imagine if AI slop is ruining search, it will also start to ruin itself when the current slop is used to train future LLMs. Basically, I think AI will short-circuit itself long term.
Now, will it short-circuit itself enough for Microsoft to stop shoving it down our throats? Probably not.
Yeah, I do realize the inevitable problem when their sources dry up because no one is communicating anymore, but for quick questions about how something works in the world, it’s extremely convenient. I’d just be asking Google anyway.
But you can’t see the source of the information, which means it could be a reputable source, or it could be Joe-Sucks-His-Own-Dick from Reddit. In another comment, I pointed out that AI was telling people to put glue on pizza to keep the cheese from falling off—if you can see the source, you are much more likely to understand the veracity of the information.
This is such a silly take.
Not trusting ChatGPT, which is known to hallucinate false information, as your primary search method is a silly take? AI was telling people to put glue on pizza to keep the cheese from falling off. If you can see that the source of that information is a Reddit shitpost, you are way more likely to make a good judgment call about the veracity of that information.
If you want searches without sponsored results, use SearXNG or an equivalent that strips out the ads.
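For anyone curious, SearXNG is a self-hosted metasearch engine that aggregates results from other engines without serving ads or tracking. A minimal sketch of how you might run it, assuming the official `searxng/searxng` Docker image and default port (check the SearXNG docs for a proper production setup with secrets and a reverse proxy):

```yaml
# docker-compose.yml — minimal local SearXNG instance (sketch, not production-ready)
services:
  searxng:
    image: searxng/searxng
    ports:
      - "8080:8080"          # web UI at http://localhost:8080
    environment:
      - SEARXNG_BASE_URL=http://localhost:8080/
    restart: unless-stopped
```

Once it’s up, you can point your browser’s default search engine at your own instance and never see a sponsored result again.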
The silly take is that using it is “part of the problem”.
Also, the glue on pizza thing is nearly a moot point. The models are much more advanced now and will continue to be.
The commercial LLMs can share their sources now, so that’s also a moot point.
It’s not going away. Learn to use it effectively.
You can actually ask for its sources now and fact check yourself. But just like anything you read online, use common sense. I’d see those same results in a Google search too.
Do you ask for the sources every time?
If it’s something serious, yes. Like fixing something. I also use it as an idea generator. I needed to figure out why my toilet wasn’t flushing. It suggested the flapper. So then I went to YouTube and looked up a video on how to install it once it pointed me in a direction.
Good, then it is a bit less of a bad tool in this instance. Just don’t lose the habit of checking your sources—it’s a slippery slope.