I suspect that this is the direct result of AI generated content just overwhelming any real content.
I tried DDG, Google, Bing, and Qwant, and none of them really helps me find the information I want these days.
Perplexity seems to work, but I don’t like the idea of AI giving me “facts,” since those are mostly based on other AI-generated posts.
ETA: someone suggested SearXNG, and after using it a bit it seems much better than DDG and the rest.
Of course I care whether the answer is correct. My point was that even when it’s not, it doesn’t really matter much because if it were critical, I wouldn’t be asking ChatGPT in the first place. More often than not, the answer it gives me is correct. The occasional hallucination is a price I’m willing to pay for the huge convenience of having something like ChatGPT to quickly bounce ideas off of and ask about stuff.
I agree that AI can be helpful for bouncing ideas off of, and it’s been a great aid in learning, too. But when I’m using it to help me learn programming, for example, I can run the code and see whether or not it actually works.
I’m automatically skeptical of anything these models tell me, because I know they could just be making something up. I always have to verify.