I've been keeping a directory of screenshots of poor DDG results for the past couple of years, and there have been many conspicuously broken results. It's a damn shame, since DDG was better 2+ years ago.
Some examples:
- Double-quoted strings are ignored; results without the quoted string keep coming back (a rough way to spot-check this follows the list).
- Adding the exclusion operator (a `-` prefix) does nothing; results containing the excluded string keep appearing.
- Various queries return non-legit sites in the majority of the top 10 results, sometimes in all of them. By non-legit I mean randomly named domains that have scraped other sites' content or are doing word-salad query matching. Example query from December last year: `google "oauth" "api" "thunderbird"` returned nothing but non-legit sites.
- Using the `site:` operator to restrict a query to a specific site can return fewer results for that domain (e.g. two results vs. dozens) than searching for the site name and query without `site:`.
- Using the `site:` operator to limit results to a TLD sometimes returns a single result or nothing, despite there being domains under that TLD that match the query. E.g. last year, searching `opnsense site:.se` returned nothing when there should have been tons of results from teklager.se (I have the screenshots), yet searching `opnsense site:teklager.se` did return results.
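Since the quoted-string failure is the easiest one to check mechanically, here's a minimal sketch for spot-checking it. Assumptions worth flagging: it scrapes DDG's `html.duckduckgo.com/html/` endpoint and relies on the `result__a`/`result__snippet` CSS classes the HTML version currently uses, any of which could change or get rate-limited, and `check_phrase` is just a helper name I made up for illustration.

```python
# Spot-check whether DDG honors exact-phrase quoting by scraping the HTML
# endpoint. CSS classes (result__a, result__snippet) are what the HTML
# version uses as of this writing; they may change without notice.
import requests
from bs4 import BeautifulSoup

def ddg_results(query: str, max_results: int = 10) -> list[dict]:
    """Fetch the first page of DDG HTML results for `query`."""
    resp = requests.get(
        "https://html.duckduckgo.com/html/",
        params={"q": query},
        # The default requests User-Agent tends to get blocked.
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=10,
    )
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    results = []
    for r in soup.select("div.result")[:max_results]:
        title = r.select_one("a.result__a")
        snippet = r.select_one("a.result__snippet")
        if title is not None:
            results.append({
                "title": title.get_text(" ", strip=True),
                "snippet": snippet.get_text(" ", strip=True) if snippet else "",
            })
    return results

def check_phrase(query: str, phrase: str) -> None:
    """Count top results whose title/snippet lack the double-quoted phrase."""
    results = ddg_results(query)
    misses = [
        r for r in results
        if phrase.lower() not in (r["title"] + " " + r["snippet"]).lower()
    ]
    print(f"{len(misses)} of {len(results)} top results lack {phrase!r}")

check_phrase('opnsense "teklager"', "teklager")
```

A miss here only means the phrase isn't in the title or snippet; the page itself might still contain it, so treat it as a smoke test. But when every top result misses the phrase, the quoting is clearly being ignored.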
In the most egregious cases I've documented, I submit feedback, but the worse the results get, the more hopeless a cause it seems.
> In the most egregious cases I've documented, I submit feedback, but the worse the results get, the more hopeless a cause it seems.
I once had a Ducker (Goer?) reply to me here on HN saying that a human does in fact read the feedback, and yet this indescribably terrible change aligns with your conclusion: they read them, and then cheerfully "put them on the backlog".