Some experts say Google is just parroting your own beliefs right back to you. It may be worsening your own biases and deepening societal divides along the way.

[…]

“Google’s whole mission is to give people the information that they want, but sometimes the information that people think they want isn’t actually the most useful,” says Sarah Presch, digital marketing director at Dragon Metrics, a platform that helps companies tune their websites for better recognition from Google using methods known as “search engine optimisation” or SEO.

[…]

> “What Google has done is they’ve pulled bits out of the text based on what people are searching for and fed them what they want to read” – Sarah Presch

  • lily33@lemm.ee
    7 hours ago

    Type in "Is Kamala Harris a good Democratic candidate

    …and any good search engine will find results containing keywords such as “Kamala Harris”, “Democratic”, “candidate”, and “good”.

    […] you might ask if she’s a “bad” Democratic candidate instead

    In that case, of course the search engine will find results containing keywords such as “Kamala Harris”, “Democratic”, “candidate”, and “bad”.

    So the whole premise that “Fundamentally, that’s an identical question” is just bullshit when it comes to searching. Obviously, when you put in the keyword “good”, you’ll find articles containing “good”, and if you put in the keyword “bad”, you’ll find articles containing “bad” instead.

    Google will find things that match the keywords that you put in. So does DuckDuckGo, Qwant, Yahoo, whatever. That is what a good search engine is supposed to do.

    I can assure you, when search engines stop doing that, and instead try to give “balanced” results, according to whatever opaque criteria for “balanced” their company comes up with, that will be the real problem.

    I don’t like Google, and only use Google when other search engines fail. But this article is BS.
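
    A minimal sketch of the keyword-overlap behaviour this comment describes, purely as illustration: the pages, tokeniser, and scoring below are invented and far cruder than anything a real engine uses, but they show why a query containing “good” and a query containing “bad” pull up different pages.

    ```python
    # Toy keyword-overlap ranker (illustrative only; real search engines use
    # far more signals than bag-of-words term counts).
    from collections import Counter

    DOCS = {
        "page-a": "Why Kamala Harris is a good Democratic candidate",
        "page-b": "Why Kamala Harris is a bad Democratic candidate",
    }

    def tokenize(text: str) -> list[str]:
        return text.lower().split()

    def score(query: str, doc: str) -> int:
        # Count how often each query term occurs in the document.
        counts = Counter(tokenize(doc))
        return sum(counts[term] for term in tokenize(query))

    def rank(query: str) -> list[str]:
        # Pages with the highest term overlap come first.
        return sorted(DOCS, key=lambda name: score(query, DOCS[name]), reverse=True)

    print(rank("is kamala harris a good democratic candidate"))  # page-a first
    print(rank("is kamala harris a bad democratic candidate"))   # page-b first
    ```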

    • Drew@sopuli.xyz
      5 hours ago

      Ah, but besides the search results there’s also a big AI summary at the top, which I’m more concerned about.

      • Handles@leminal.space
        2 hours ago

        Yeah. Be very, very afraid of people using search engines or “AI” as some Magic Eightball oracle to give them answers.

  • kbal@fedia.io
    12 hours ago

    The Featured Snippet quoted an article from the Mayo Clinic, highlighting the words “Caffeine may cause a short, but dramatic increase in your blood pressure.” But when she looked up “no link between coffee and hypertension”, the Featured Snippet cited a contradictory line from the very same Mayo Clinic article: “Caffeine doesn’t have a long-term effect on blood pressure and is not linked with a higher risk of high blood pressure”.

    On the one hand, Google sucks. On the other hand, if people are unable to a) understand how those two snippets are not contradictory, and b) read at least one very short simplified-for-laymen Mayo Clinic article about the topic before thinking they’ve learned anything at all about medicine, it’s hard to see the problem as being primarily due to Google. There is something deeper, and worse, going wrong when people habitually take that kind of extreme shortcut to thinking that they know the right answer about almost anything, and it has little to do with whether any one-sentence snippets they’re given are biased or accurate.
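
    A rough sketch of the mechanism behind the Featured Snippet example above: if the snippet picker simply returns whichever sentence of the article overlaps most with the query wording, two opposite-sounding queries will surface the two different Mayo Clinic lines quoted here, neither of which contradicts the other. The first query below is hypothetical; only the second appears in the article, and real snippet selection is of course far more involved.

    ```python
    # Toy "featured snippet" picker: return the article sentence that shares the
    # most wording with the query. The sentences are the two Mayo Clinic lines
    # quoted above; everything else here is invented for illustration.

    ARTICLE = [
        "Caffeine may cause a short, but dramatic increase in your blood pressure.",
        "Caffeine doesn't have a long-term effect on blood pressure and is not "
        "linked with a higher risk of high blood pressure.",
    ]

    STOPWORDS = {"a", "an", "and", "but", "does", "in", "is", "of", "the", "with", "your"}

    def key_terms(text: str) -> set[str]:
        # Lowercased words, punctuation trimmed, stopwords dropped.
        return {w.strip(".,'") for w in text.lower().split()} - STOPWORDS

    def pick_snippet(query: str) -> str:
        # Score each sentence by how many query terms it contains (substring
        # match, so "link" also hits "linked"); the best-scoring sentence wins.
        def overlap(sentence: str) -> int:
            words = key_terms(sentence)
            return sum(any(t in w for w in words) for t in key_terms(query))
        return max(ARTICLE, key=overlap)

    print(pick_snippet("does coffee cause a spike in blood pressure"))  # hypothetical query, returns the first line
    print(pick_snippet("no link between coffee and hypertension"))      # query from the article, returns the second line
    ```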

    • ranandtoldthat@beehaw.org
      8 hours ago

      Yes, but this issue is not one we should want Google solving. We need better media literacy education throughout life.

      • watson387@sopuli.xyz
        7 hours ago

        I don’t disagree with you. I’m saying Google’s algorithm is part of the cause, not the cure.