When Mani Devi, an Accredited Social Health Activist (ASHA) in rural Rajasthan, saw the underweight infant, she knew something was wrong, but not how serious it might be, or what advice to give.
So she reached for her phone and opened WhatsApp. In Hindi, she typed a question to a new tool called ASHABot: What’s the ideal weight for a baby this age?
The chatbot, trained in Hindi, English, and a hybrid known as Hinglish, responded within seconds: a baby that age should weigh around 4 to 5 kilograms. This one weighed less.
The bot’s answer was clear and specific. It encouraged feeding the baby eight to 10 times a day, and it explained how to counsel the mother without causing alarm.
That, she said, was one of many encounters with ASHABot that changed the way she does her job.
The tool is part of a quiet but significant shift in public health, one that blends cutting-edge artificial intelligence with on-the-ground realities in some of India’s most underserved communities.
ASHABot, launched in early 2024, is what happens when a generative AI model such as OpenAI’s ChatGPT or GPT-4 is not only trained on the broader internet but also connected to a knowledge base containing India’s public health manuals, immunization guidelines, and family planning protocols. It takes voice notes when prompted and provides answers that help the ASHAs serve patients.
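The article doesn’t publish ASHABot’s implementation, but the design it describes, a model that answers from a curated corpus rather than its general training data alone, is the retrieval-augmented generation pattern. A minimal, runnable sketch of that shape follows; the passages, the naive word-overlap retriever, and the prompt wording are all invented for illustration and are not Khushi Baby’s code.

```python
# Toy knowledge base standing in for the curated health documents.
KNOWLEDGE_BASE = [
    "Infant feeding: a newborn should be breastfed 8 to 10 times in 24 hours.",
    "Immunization: BCG, OPV-0, and hepatitis B birth dose are given at birth.",
    "Family planning: counsel couples on spacing methods after childbirth.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank passages by naive word overlap with the question.
    A real system would use embeddings and a vector index instead."""
    q_words = set(question.lower().split())
    return sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(question: str) -> str:
    """Constrain the model to the retrieved guidance instead of open recall."""
    passages = "\n".join(retrieve(question))
    return (
        "Answer using ONLY the passages below. If they do not cover the "
        f"question, say you are unsure.\n\n{passages}\n\nQuestion: {question}"
    )

print(build_prompt("How many times should a newborn be fed in a day?"))
```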
Built by the nonprofit Khushi Baby using technology developed and open sourced by Microsoft Research, the bot has been transforming how some of the country’s ASHA workers do their jobs. These women are the glue between India’s rural households and the health system, responsible for everything from vaccination records to childbirth counseling. But they receive just 23 days of basic training and often work in settings where doctors are distant, supervisors are overburdened, and even cell signal is unreliable.
“ASHAs have always been on the front lines,” said Ruchit Nagar, co-founder and CEO of Khushi Baby and a Harvard-trained physician. “But they haven’t always had the tools.”
Nagar’s relationship with ASHAs goes back nearly a decade. In 2015, he launched Khushi Baby with the goal of digitizing health data in underserved communities, often designing tech systems that were locally grounded. The idea of ASHABot emerged in late 2023, during a summit with stakeholders in Rajasthan.
At the time, Khushi Baby was working with Microsoft Research on a separate AI project, one that used photographs of the eye to detect anemia. But the buzz around large language models, especially ChatGPT, was growing fast. Nagar and his collaborators began to ask whether this technology could help ASHAs, who often lacked real-time access to quality, understandable, medically sound guidance.
“ASHAs were already using WhatsApp and YouTube. We saw an inflection point, new digital users ready for something more,” said Nagar, now a resident at the Yale School of Medicine in New Haven, Conn.
So they began building.
Microsoft researcher Pragnya Ramjee joined the project around that time, leaving a design job at a hedge fund to focus on technology with social impact. With a background in human-centered design, she helped lead the qualitative research, interviewing ASHAs in Rajasthan alongside a trained translator.
“It made a huge difference that the translator and I were women,” she said. “The ASHAs felt more comfortable being open with us, especially about sensitive issues like contraception or gender-based violence.”

Ramjee and the team helped fine-tune the system in collaboration with doctors and public health experts. The model, based on GPT-4, was trained to be highly accurate. When it receives a question, it consults a carefully curated database of around 40 documents from the Indian government, UNICEF, and other health bodies. If the bot doesn’t find a clear answer, it doesn’t guess. Instead, it forwards the question to a small group of nurses, whose responses are then synthesized by the model and returned to the ASHA within hours.
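That answer-or-escalate behavior can be sketched in a few lines. Everything below is hypothetical: ask_grounded_model stands in for ASHABot’s actual GPT-4 call over its roughly 40 curated documents, and the nurse replies are passed in by hand here, where the real system gathers them over WhatsApp within hours.

```python
def ask_grounded_model(question: str) -> str | None:
    """Stand-in for the GPT-4 call over the curated documents. Returns
    None when the corpus gives no clear answer, so the bot never guesses."""
    known = {"ideal weight for a two-month-old": "Around 4 to 5 kilograms."}
    return known.get(question)

def synthesize(question: str, nurse_replies: list[str]) -> str:
    """In the real system the model rewrites the nurses' answers into one
    clear reply for the ASHA; simple joining stands in for that step."""
    return f"Nurse guidance on '{question}': " + " ".join(nurse_replies)

def handle(question: str, nurse_replies: list[str]) -> str:
    answer = ask_grounded_model(question)
    if answer is not None:
        return answer  # grounded answer, returned within seconds
    # No clear answer in the documents: escalate instead of guessing.
    return synthesize(question, nurse_replies)  # returned within hours

print(handle("ideal weight for a two-month-old", []))
print(handle("what to do about a conflict in the family",
             ["Listen to both sides; involve the local counselor if needed."]))
```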
The goal, Ramjee said, is to ensure the bot always stays grounded in reality and in the actual training ASHAs receive.
So far, more than 24,000 messages have been sent through the system and 869 ASHAs have been onboarded. Some workers have used it only once or twice. Others send up to 20 messages in a single day. Topics range from the expected, such as childhood immunization schedules and breastfeeding best practices, to the unexpected.
“They’re asking about contraception, about child marriage, about what to do if there’s a conflict in the family,” Ramjee said. “These aren’t just medical questions. They’re social questions.”

One woman came to Mani Devi saying she’d missed her period for two months but wasn’t pregnant. The bot provided Devi with information that gave her the confidence to reassure the patient she had nothing to worry about.
The responses come as both text and voice notes, the latter often played aloud by ASHAs for the patient to hear. In some cases, voice responses about long-acting contraception have helped persuade hesitant women to begin treatment.
There is no question the technology works. But the team is quick to emphasize that it doesn’t replace human knowledge. Instead, it amplifies it. ASHABot illustrates how LLM-powered chatbots can help bridge the information gap for people, particularly those with limited access to formal training and technology, said Mohit Jain, principal researcher at Microsoft Research India.
“There is a lot of debate about whether LLMs are a boon or a bane,” Jain said. “I believe it’s up to us to design and deploy them responsibly, in ways that unlock their potential for real societal benefit. ASHABot is one example of how that’s possible.”

Of course, the chatbot isn’t perfect. Some users still prefer to call people they know, and the big question of scaling remains. The team is exploring personalization options, multimodal support such as image inputs, and parallel LLM agents for quality assurance at scale.
Still, the vision is expansive. As of now, ASHABot is used only in Udaipur, one of the 50 districts in Rajasthan. The long-term goal is to bring ASHABot to all one million ASHAs across the country, who serve about 800 to 900 million people in rural India. The potential ripple effect across maternal health, vaccination, and disease surveillance is immense.
Nagar, who has traveled to India twice a year for the last 10 years to research the needs of ASHAs, said there are still “many things yet to explore, and many big questions to answer.”
For ASHAs like Mani Devi, the shift is already real. She says she feels more informed, more confident. She can talk about previously taboo subjects, because the bot helps her break the silence.
“Overall, I can give better information to people who need help,” she said. “I can ask it anything.”