{"id":5449,"date":"2025-05-02T05:22:00","date_gmt":"2025-05-02T05:22:00","guid":{"rendered":"https:\/\/medical-article.com\/?p=5449"},"modified":"2025-05-02T05:22:00","modified_gmt":"2025-05-02T05:22:00","slug":"dr-google-starts-sharing-regular-folks-advice-as-chatbots-loom","status":"publish","type":"post","link":"https:\/\/medical-article.com\/?p=5449","title":{"rendered":"Dr. Google Starts Sharing Regular Folks\u2019 Advice As Chatbots Loom"},"content":{"rendered":"<div class=\"wp-block-image\">\n<\/div>\n<p>By MICHAEL MILLENSON<\/p>\n<p>\u201cDr. Google,\u201d the nickname for the search engine that answers hundreds of millions of health questions every day, has begun including advice from the general public in some of its answers. The \u201cWhat People Suggest\u201d feature, presented as a response to user demand, comes at a pivotal point for traditional web search amid the growing popularity of artificial intelligence-enabled chatbots such as ChatGPT.<\/p>\n<p>The <a href=\"https:\/\/blog.google\/technology\/health\/the-check-up-health-ai-updates-2025\/\">new feature<\/a>, currently available only to U.S. mobile users, is populated with content culled, analyzed and filtered from online discussions at sites such as Reddit, Quora and X. Though Google says the information will be \u201ccredible and relevant,\u201d an obvious concern is whether an algorithm whose raw material is online opinion could end up as a global super-spreader of misinformation that\u2019s wrong or even dangerous. What happens if someone is searching for alternative treatments for cancer or wondering whether vitamin A can prevent measles?<\/p>\n<p>In a wide-ranging interview, I posed those and other questions to <a href=\"https:\/\/research.google\/people\/michaeldhowellmdmph\/?&amp;type=google\">Dr. Michael Howell<\/a>, Google\u2019s chief clinical officer. Howell explained why Google initiated the feature and how the company intends to ensure its helpfulness and accuracy. 
Although he framed the feature within the context of the company\u2019s long-standing mission to \u201corganize the world\u2019s information and make it universally accessible and useful,\u201d the increasing competitive pressure on Google Search in the artificial intelligence era, particularly for a topic that generates billions of dollars in Search-related revenue from sponsored links and ads, hovered inescapably in the background.<\/p>\n<h2 class=\"wp-block-heading\"><em>Weeding Out Harm<\/em><\/h2>\n<p>Howell joined Google in 2017 from University of Chicago Medicine, where he served as chief quality officer. Before that, he was a rising star at the Harvard system thanks to his work as both researcher and front-line leader in using the science of health care delivery to improve care quality and safety. When Howell speaks of consumer searches related to chronic conditions like diabetes and asthma or more serious issues such as blood clots in the lung \u2013 he\u2019s a pulmonologist and intensivist \u2013 he does so with the passion of a patient care veteran and someone who\u2019s served as a resource when illness strikes friends and family.<\/p>\n<p>\u201cPeople want authoritative information, but they also want the lived experience of other people,\u201d Howell said. \u201cWe want to help them find that information as easily as possible.\u201d<\/p>\n<p>He added, \u201cIt\u2019s a mistake to say that the only thing we should do to help people find high-quality information is to weed out misinformation. Think about making a garden. If all you did was weed things, you\u2019d have a patch of dirt.\u201d<\/p>\n<p>That\u2019s true, but it\u2019s also true that if you do a poor job of weeding, the weeds that remain can harm or even kill your plants. 
And the stakes involved in weeding out bad health information and helping good advice flourish are far higher than in horticulture.<\/p>\n<p>Google\u2019s weeder-wielding work starts with digging out those who shouldn\u2019t see the feature in the first place. Even for U.S. mobile users, the target of the initial rollout, not every query will prompt a What People Suggest response. The information has to be judged helpful and safe.<\/p>\n<p>If someone\u2019s looking for answers about a heart attack, for example, the feature doesn\u2019t trigger, since it could be an emergency situation. <\/p>\n<p>What the user will see, however, is what\u2019s typically displayed high up in health searches; i.e., authoritative information from sources such as the Mayo Clinic or the American Heart Association. Ask about suicide, and in America the top result will be the 988 Suicide and Crisis Lifeline, linked to text or chat as well as showing a phone number. Also out of bounds are people\u2019s suggestions about prescription drugs or a medically prescribed intervention such as preoperative care.<\/p>\n<p>When the feature does trigger, there are other built-in filters. AI has been key, said Howell, adding, \u201cWe couldn\u2019t have done this three years ago. It wouldn\u2019t have worked.\u201d<\/p>\n<p>Google deploys its Gemini AI model to scan hundreds of online forums, conversations and communities, including Quora, Reddit and X, gather suggestions from people who\u2019ve been coping with a particular condition and then sort them into relevant themes. A custom-built Gemini application assesses whether a claim is likely to be helpful or contradicts medical consensus and could be harmful. 
It\u2019s a vetting process deliberately designed to avoid amplifying advice like vitamin A for measles or <a href=\"https:\/\/www.nytimes.com\/2025\/03\/29\/opinion\/medical-freedom-cancer-rfk.html\">dubious cancer cures<\/a>.<\/p>\n<p>As an extra safety check before the feature went live, samples of the model\u2019s responses were assessed for accuracy and helpfulness by panels of physicians assembled by a third-party contractor.<\/p>\n<h2 class=\"wp-block-heading\"><em>Dr. Google Listens to Patients<\/em><\/h2>\n<p>Recommendations that survive the screening process are presented as brief What People Suggest descriptions in the form of links inside a boxed, table-of-contents format within Search. The feature isn\u2019t part of the top menu bar for results, but requires scrolling down to access. The presentation \u2013 not paragraphs of response, but short menu items \u2013 emerged out of extensive consumer testing.<\/p>\n<p>\u201cWe want to help people find the right information at the right time,\u201d Howell said. There\u2019s also a feedback button allowing consumers to indicate whether an option was helpful, unhelpful or incorrect in some way.<\/p>\n<p>In Howell\u2019s view, What People Suggest capitalizes on the \u201clived experience\u201d of people being \u201cincredibly smart\u201d in how they cope with illness. As an example, he pulled up the What People Suggest screen for the skin condition eczema. One recommendation for alleviating irritating itching was \u201ccolloidal oatmeal.\u201d That recommendation from eczema sufferers, Howell quickly showed via Google Scholar, is actually supported by a randomized controlled trial.<\/p>\n<p>It will surely take time for Google to persuade skeptics. Dr. Danny Sands, an internist, co-founder of the Society for Participatory Medicine and co-author of the book Let Patients Help, told me he\u2019s wary of whether \u201ccommon wisdom\u201d that draws voluminous support online is always wise. 
\u201cIf you want to really hear what people are saying,\u201d said Sands, \u201cgo to a mature, online support community where bogus stuff gets filtered out from self-correction.\u201d (Disclosure: I\u2019m a longtime SPM member.)<\/p>\n<p>A Google spokesperson said Search crawls the web, and sites can opt in or out of being indexed. She said several \u201crobust patient communities\u201d are being indexed, but she could not comment on every individual site.<\/p>\n<h2 class=\"wp-block-heading\"><em>Chatbots Threaten<\/em><\/h2>\n<p>Howell repeatedly described What People Suggest as a response to users demanding high-quality information on living with a medical condition. Given the importance of Search to Google parent Alphabet (whose name, I\u2019ve noted elsewhere, <a href=\"https:\/\/mlmillenson.medium.com\/does-googles-kabbalah-clout-make-it-divine-bd2c8c3ccf64\">has an interesting kabbalistic interpretation<\/a>), I\u2019m sure that\u2019s true.<\/p>\n<p>Alphabet\u2019s 2024 annual report folds Google Search into \u201cGoogle Search &amp; Other.\u201d It\u2019s a $198 billion, highly profitable category <a href=\"https:\/\/www.visualcapitalist.com\/alphabets-revenue-breakdown-in-2024\/\">that accounts for close to 60% of Alphabet\u2019s revenue<\/a> and includes Search, Gmail, Google Maps, Google Play and other sources. When that unit <a href=\"https:\/\/www.forbes.com\/sites\/dereksaul\/2025\/04\/24\/google-earnings-stock-soars-as-q1-results-shatter-expectations\/\">reported better-than-expected revenues<\/a> in Alphabet\u2019s first-quarter earnings release on April 24, the stock immediately jumped.<\/p>\n<p>Health queries constitute an estimated 5-7% of Google searches, easily adding up to billions of dollars in revenue from sponsored links. 
Any feature that keeps users returning is important at a time when a federal court\u2019s antitrust verdict threatens the lucrative Search franchise and a prominent AI company <a href=\"https:\/\/www.forbes.com\/sites\/torconstantino\/2025\/04\/24\/if-openai-buys-chrome-ai-may-rule-the-browser-wars\/\">has expressed interest<\/a> in buying Chrome if Google is forced to divest.<\/p>\n<p>The larger question for Google, though, is whether health information seekers will continue to seek answers from even user-popular features like What People Suggest and AI Overview at a time when AI chatbots are becoming increasingly popular. Although Howell asserted that individuals use Google Search and chatbots for different kinds of experiences, anecdote and evidence point to chatbots chasing away some Search business.<\/p>\n<p>Anecdotally, when I tried out several ChatGPT queries on topics likely to trigger What People Suggest, the chatbot did not provide quite as much detailed or useful information; however, it wasn\u2019t that far off. Moreover, I had repeated difficulty triggering What People Suggest even with queries that replicated what Howell had done.<\/p>\n<p>The chatbots, on the other hand, were quick to respond and to do so empathetically. For instance, when I asked ChatGPT, from OpenAI, what it might recommend for my elderly mom with arthritis \u2013 the example used by a Google product manager in the What People Suggest rollout \u2013 the large language model chatbot prefaced its advice with a large dose of emotionally appropriate language. \u201cI\u2019m really sorry to hear about your mom,\u201d ChatGPT wrote. 
\u201cLiving with arthritis can be tough, both for her and for you as a caregiver or support person.\u201d When I accessed Gemini separately from the terse AI Overview version now built into Search, it, too, took a sympathetic tone, beginning, \u201cThat\u2019s thoughtful of you to consider how to best support your mother with arthritis.\u201d<\/p>\n<p>There are more prominent rumbles of discontent. Echoing common complaints about the clutter of sponsored links and ads, Wall Street Journal <a href=\"https:\/\/www.wsj.com\/tech\/personal-tech\/google-search-chatgpt-perplexity-gemini-6ac749d9\">tech columnist Joanna Stern wrote<\/a> in March, \u201cI quit Google Search for AI \u2013 and I\u2019m not going back.\u201d \u201cGoogle Is Searching For an Answer to ChatGPT,\u201d <a href=\"https:\/\/www.bloomberg.com\/news\/features\/2025-03-24\/google-s-ai-search-overhaul-racing-chatgpt-for-the-web-s-future?cmpid=032525_morningamer&amp;utm_medium=email&amp;utm_source=newsletter&amp;utm_term=250325&amp;utm_campaign=morningamer\">chipped in Bloomberg Businessweek<\/a> around the same time. In late April, a Washington Post op-ed <a href=\"https:\/\/www.washingtonpost.com\/opinions\/2025\/04\/22\/ai-health-care-expert-opinions\/\">took direct aim at<\/a> Google Health, calling AI chatbots \u201cmuch more capable\u201d than \u201cDr. Google.\u201d<\/p>\n<p>When I reached out to pioneering patient activist Gilles Frydman, founder of an early interactive online site for those with cancer, he responded similarly. 
\u201cWhy would I do a search with Google when I can get such great answers with ChatGPT?\u201d he said.<\/p>\n<p>Perhaps more ominously, in a study involving structured interviews with a diverse group of around 300 participants, two researchers at Northeastern University found \u201ctrust trended higher for chatbots than Search Engine results, regardless of source credibility\u201d and \u201csatisfaction was highest\u201d with a standalone chatbot, rather than a chatbot plus traditional search. Chatbots were valued \u201cfor their concise, time-saving answers.\u201d The <a href=\"https:\/\/programs.sigchi.org\/chi\/2025\/program\/content\/194491\">study abstract<\/a> was shared with me a few days before the paper\u2019s scheduled presentation at an international conference on human factors in computing systems.<\/p>\n<h2 class=\"wp-block-heading\"><em>Google\u2019s Larger Ambitions<\/em><\/h2>\n<p>Howell\u2019s team of physicians, psychologists, nurses, health economists, clinical trial experts and others interacts with not just Search, but YouTube \u2013 which last year racked up a mind-boggling 200 billion views of health-related videos \u2013 Google Cloud and the AI-oriented Gemini and DeepMind. They\u2019re also part of the larger Google Health effort headed by chief health officer <a href=\"https:\/\/en.wikipedia.org\/wiki\/Karen_DeSalvo\">Dr. Karen DeSalvo<\/a>. 
DeSalvo is a prominent public health expert who\u2019s held senior positions in federal and state government and academia, as well as serving on the board of a large, publicly held health plan.<\/p>\n<p>In a post last year entitled, \u201c<a href=\"https:\/\/blog.google\/technology\/health\/google-health-strategy-ai\/\">Google\u2019s Vision For a Healthier Future<\/a>,\u201d DeSalvo wrote: \u201cWe have an unprecedented opportunity to reimagine the entire health experience for individuals and the organizations serving them \u2026 through Google\u2019s platforms, products and partnerships.\u201d<\/p>\n<p>I\u2019ll speculate for just a moment how \u201clived experience\u201d information might fit into this reimagination. Google Health encompasses a portfolio of initiatives, from an AI \u201cco-scientist\u201d product for researchers to Fitbit for consumers. With de-identified data or data individual consumers consent to be used, \u201clived experience\u201d information is just a step away from being transformed into what\u2019s called \u201creal world evidence.\u201d If you look at the <a href=\"https:\/\/health.google\/health-research\/\">kind of research<\/a> Google Health already conducts, we\u2019re not far from an AI-informed YouTube video showing up on my Android smartphone in response to my Fitbit data, perhaps with a handy link to a health system that\u2019s a Google clinical and financial partner.<\/p>\n<p>That\u2019s all speculation, of course, which Google unsurprisingly declined to comment upon. More broadly, Google\u2019s call for \u201creimagining the entire health experience\u201d surely resonates with everyone yearning to transform a system that\u2019s too often dysfunctional and detached from those it\u2019s meant to serve. 
What People Suggest can be seen as a modest step in listening more carefully and systematically to the individual\u2019s voice and needs.<\/p>\n<p>But the coda in DeSalvo\u2019s blog post, \u201cthrough Google\u2019s platforms, products and partnerships,\u201d also sends a linguistic signal. It shows that one of the world\u2019s largest technology companies sees an enormous economic opportunity in what is rightly called \u201cthe most exciting inflection point in health and medicine in generations.\u201d<\/p>\n<p><em>Michael L. Millenson is president of Health Quality Advisors &amp; a regular THCB Contributor. This first appeared in his column <\/em><a href=\"https:\/\/www.forbes.com\/sites\/michaelmillenson\/2025\/04\/28\/dr-google-starts-sharing-regular-folks-advice-as-chatbots-loom\/\"><em>at Forbes<\/em><\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>By MICHAEL MILLENSON \u201cDr. Google,\u201d the nickname for the search engine that answers hundreds of millions of health questions every day, has begun including advice from the general public in some of its answers. 
The \u201cWhat People Suggest\u201d feature, presented as a response to user demand, comes at a pivotal point for traditional web search&#8230;<\/p>\n","protected":false},"author":0,"featured_media":5448,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"class_list":["post-5449","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-articles"],"_links":{"self":[{"href":"https:\/\/medical-article.com\/index.php?rest_route=\/wp\/v2\/posts\/5449"}],"collection":[{"href":"https:\/\/medical-article.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/medical-article.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/medical-article.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5449"}],"version-history":[{"count":0,"href":"https:\/\/medical-article.com\/index.php?rest_route=\/wp\/v2\/posts\/5449\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/medical-article.com\/index.php?rest_route=\/wp\/v2\/media\/5448"}],"wp:attachment":[{"href":"https:\/\/medical-article.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5449"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/medical-article.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=5449"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/medical-article.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5449"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}