How AI is reshaping physiotherapy - and how physiotherapy can shape AI

One of the most energised sessions at the World Physiotherapy Congress 2025, 29–31 May, didn’t take place in a lecture theatre—it unfolded in the open Idobata space, inspired by the Japanese tradition of informal “well-side chats.” A standing-room-only crowd gathered for a highly anticipated and dynamic exchange on artificial intelligence (AI) and assistive technologies in physiotherapy.

The session was more than a conversation about tools—it was a global reality check. Clinicians, researchers and educators shared not just what’s possible with AI, but what’s actually happening, where it’s falling short, and how physiotherapy must lead its responsible evolution.

Christopher Lo, a clinician-researcher from Singapore, described using AI tools such as ChatGPT and DeepSeek to automate labour-intensive tasks in systematic reviews and policy drafting.

“Ten articles used to take a week,” said Christopher. “Now, half a day—and then I drink coffee.”

Other participants echoed this shift, reporting uses from assisting with research appraisals to supporting patient education and even differential diagnosis based on patient histories.

But alongside the enthusiasm came caution. The issue of AI “hallucinations”—where fabricated references and inaccurate outputs undermine credibility—was a recurring theme. One participant from Pakistan shared that using a detailed “persona” prompt (e.g. “You are a senior MSK researcher…”) improved results, but even so, everything still needed to be fact-checked manually.

The consensus: AI can increase efficiency, but it won’t replace clinical reasoning—or rigour.

Rodrigo Rizzo, from Neuroscience Research Australia, shifted the focus to clinical care. His research explores how AI-powered conversational agents could support people with persistent pain.

“We broke primary care into three phases—before, during and after the consultation,” he explained. “Each one has its own communication gaps.”

Patients often arrive with misconceptions. Clinicians may not have time to share nuanced evidence. Afterward, questions go unanswered. Conversational agents, Rodrigo suggested, could help fill these gaps by providing tailored, evidence-informed guidance that supports self-management and engagement.

But he also asked: “What would you need to use these tools confidently?” The answers from the room were clear—stronger digital literacy, clinician-led design, and frameworks to ensure safety, accessibility and trust.

While AI was the centrepiece, Hiroto Hayashi shifted the lens to assistive technology—and the hidden barriers to adoption even in so-called ‘tech-forward’ nations.

“We make amazing rehab tech in Japan—wearables, exoskeletons, mobility tools,” said Hiroto. “But many are classified as welfare equipment here. That limits how they’re used.”

Hiroto works across Japan and low-resource communities in Asia. In some of those lower-resourced settings, he said, access to assistive technology is actually easier than at home. Classification as “non-medical” affects funding, perception and availability. Participants offered practical solutions—reframing devices as medical through global examples, showing policymakers their clinical relevance, and advocating for reclassification. The tension was clear: innovation is not the same as implementation.

While much of the session focused on opportunity and curiosity, it also tackled harder questions—particularly around equity, data and representation.

An Australian participant raised environmental concerns, urging participants to consider the water and energy demands of large models and to avoid uploading sensitive data without safeguards.

Rowena Williams, a physiotherapy educator working across the UK and UAE, reinforced the ethical risks—highlighting the breaches that occur when clinicians enter patient data into AI systems without clear protocols, and the dangers of relying on Western datasets. “AI is useful because it can give us all a voice,” said Rowena. “But right now, the global North dominates.”

She pointed to the Chartered Society of Physiotherapy’s new framework on AI and digital technologies as an example of the profession stepping up to lead ethical implementation. But guidelines alone, she warned, won’t be enough—clinicians must remain engaged in how these tools are designed, governed and applied in practice.

“We’re tired. Burnout is real,” she said. “But we have a chance to use these technologies—to work more efficiently, more ethically, and with better representation.”

Looking around the room, Rowena added, “In this space, we have the world.” It was both a reminder and a call to action: that equity in AI begins with who’s in the room—and who gets heard.

Her message resonated: physiotherapists shouldn’t just adapt to AI—they must take an active role in shaping how it’s used, who it serves and what values guide its evolution.

There were no easy conclusions—but there was clarity. AI and assistive technologies are already part of physiotherapy. The task now is to ensure their integration is guided by ethics, equity and evidence—not just efficiency.

“This isn’t about replacing care,” said one participant. “It’s about reimagining how we deliver it—together.”