My main focus isn’t on quality of results.
The real problem is that whenever I type a query that's even slightly out of the ordinary, I get no results at all. Google, DuckDuckGo, Startpage, Searx, all of them.
It strains credulity that such advanced technology can't find anything among billions of websites, when less advanced versions of these same engines used to turn up results for similar queries across a much smaller web.
To reiterate: My problem is not with SEO, or spam, or AI-generated websites, or irrelevant results. My problem is getting no results much more frequently than in the past.
I don't have any official information either, but here's a potential explanation that doesn't require a conspiracy theory:
In my experience, search engines dropped off rapidly in quality in the first few months after LLMs (ChatGPT and such) became publicly available. LLMs made it trivial to churn out plausible-looking webpages, and there's a financial incentive to do so, because you can put ads on the page and get paid for every unfortunate visitor. The result is tons of spam webpages.
Search engines had to deal with those spam pages, i.e. filter them out, or the results would have been even more useless. But that's where the problem comes in: because LLMs are built specifically to imitate human writing, their output is extremely difficult to detect reliably, so any filter aggressive enough to catch it also catches lots of legitimate, human-written pages.
In particular, I'm guessing they aggressively filter pages that aren't widely known, and a little-known page could just as easily be a brand-new spam site as that niche blog with the answer you're looking for. The engine can't reliably tell the two apart, so both get filtered out (see the toy sketch below).
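To make that guess concrete, here's a purely hypothetical sketch of a "popularity plus spam score" filter. None of this reflects how any real engine actually works; the names, thresholds, and scores are all made up for illustration. The point is just that a filter keyed on how widely known a page is throws out the long tail along with the spam.

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    inbound_links: int   # rough proxy for how widely known the page is
    spam_score: float    # output of some imperfect spam classifier, 0..1

# Hypothetical thresholds; real engines don't publish theirs.
MIN_INBOUND_LINKS = 50
MAX_SPAM_SCORE = 0.3

def keep(page: Page) -> bool:
    """Keep a page only if it's both 'known enough' and unlikely to be spam."""
    return page.inbound_links >= MIN_INBOUND_LINKS and page.spam_score <= MAX_SPAM_SCORE

candidates = [
    Page("https://example-spam-farm.test/widget-review", inbound_links=2, spam_score=0.4),
    Page("https://niche-hobby-blog.test/obscure-fix", inbound_links=3, spam_score=0.2),
    Page("https://big-known-site.test/popular-article", inbound_links=9000, spam_score=0.1),
]

results = [p.url for p in candidates if keep(p)]
print(results)
# Only the widely known page survives: the niche blog with the actual answer
# gets dropped right alongside the spam farm, leaving a near-empty results page.
```

For an unusual query whose only good answers live on obscure pages, a filter like this leaves nothing to show, which would look exactly like the "no results" behavior described above.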
That seems like the most plausible explanation to me, since the spate of no-results pages coincided with the onset of the AI-pocalypse.