I suspect that this is the direct result of AI-generated content just overwhelming any real content.
I tried ddg, google, bing, qwant, and none of them really help me find the information I want these days.
Perplexity seems to work, but I don’t like the idea of AI giving me “facts” when they’re mostly based on other AI-generated posts.
ETA: someone suggested SearXNG, and after using it a bit it seems much better than ddg and the rest.
I honestly don’t even remember the last time I googled something. Nowadays I’ll just ask ChatGPT.
The problem with getting answers from AI is that if it doesn’t know something, it’ll just make it up.
“If I have to create stories so that the American media actually pays attention to the suffering of the American people, then that’s what I’m going to do.”
Sounds an awful lot like some coworkers
LLMs have their flaws, but for my use they’re usually good enough. It’s rarely mission-critical information that I’m looking for. It satisfies my thirst for an answer, and even if it’s wrong I’m probably going to forget it in a few hours anyway. If it’s something important, I’ll start with ChatGPT and then fact-check it by looking up the information myself.
So, let me get this straight…you “thirst for an answer”, but you don’t care whether or not the answer is correct?
This is like being addicted to YouTube “top 10 facts” videos and the like.
Of course I care whether the answer is correct. My point was that even when it’s not, it doesn’t really matter much because if it were critical, I wouldn’t be asking ChatGPT in the first place. More often than not, the answer it gives me is correct. The occasional hallucination is a price I’m willing to pay for the huge convenience of having something like ChatGPT to quickly bounce ideas off of and ask about stuff.
I agree that AI can be helpful for bouncing ideas off of. It’s been a great aid in learning, too. However, when I’m using it to help me learn programming, for example, I can run the code and see whether or not it works.
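For example, a quick sanity check might look like this (a minimal Python sketch; the sort_unique function and the test values are just made up for illustration, not something ChatGPT actually produced):

```python
# Hypothetical function an AI might suggest: deduplicate a list and sort it.
def sort_unique(values):
    return sorted(set(values))

# Instead of trusting the explanation, I can run a couple of checks myself.
assert sort_unique([3, 1, 2, 3]) == [1, 2, 3]
assert sort_unique([]) == []
print("looks correct")
```

If the asserts fail, I know right away the answer was wrong, which is a luxury you don’t get when asking it for plain facts.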
I’m automatically skeptical of anything they tell me, because I know they could just be making something up. I always have to verify.