AI and digital health tools promise smarter patient support — but where do you actually start?
We sat down with Amos Adler, President & Founder of MEMOTEXT, and Kimberley Cody-Fougere, Director of Offerings and Marketing at STI Technologies. Together, they explore where data and AI can create real value — and why partnership is key to shaping the future of patient support.
Patient needs have changed. “We thought patients needed more information and reminders,” says Amos. “What they really need is relevance — evidence-informed content delivered at the right time, in the right way. And co-design matters — programs work best when patients, clinical partners, and commercial teams build them together.”
“It’s about meeting patients where they are,” Kimberley adds, “giving them options, and moving beyond one-size-fits-all solutions. Before COVID-19, digital engagement faced resistance. The pandemic changed that — now it’s expected.”
Understanding patient needs is only part of the picture — the challenge is meeting those needs in a way that feels personal and relatable.
How do you make digital tools feel human?
Amos:
Co-creation, cultural relevance and personalization. We treat every message as a conversation. Discretion is also really important when engaging with people, especially around their health. Not everyone wants others to know what they’re going through, so it’s crucial to build in subtlety.
Part of it is just creating engagement that doesn’t suck! On top of that, it needs to be something people can actually relate to — not clinical or cold.
Kimberley:
I’ll echo what Amos said. To create a truly meaningful experience that genuinely supports patients, it's essential to understand their individual therapeutic journey, lived experiences, pain points, and behaviors.
The “spray and pray” tactic — sending blanket messages to all patients — just becomes white noise. Health literacy is another crucial piece. Medical language can be overwhelming, and every patient processes it differently. Meeting people at the right level of understanding is key — and it’s where AI and natural language processing can really help.
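To make Kimberley’s health-literacy point concrete, here is a minimal sketch of one way NLP can gauge whether content matches a patient’s reading level: scoring text with the standard Flesch–Kincaid grade formula. The formula itself is well established; the syllable counter below is a rough vowel-group heuristic for illustration, not a production approach, and the function names are hypothetical.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels (minimum 1).
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    # Standard Flesch-Kincaid grade-level formula:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)

simple = "Take one pill each day. Take it with food."
dense = ("Administer the medication concomitantly with nourishment "
         "to mitigate gastrointestinal discomfort.")
```

A program could use a score like this to route patients toward plainer or more detailed versions of the same message, meeting each person at their level of understanding.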
The Rise of AI in Patient Support Programs: Opportunities and Challenges
Few topics in healthcare generate as much buzz as artificial intelligence, and for every bold claim about its potential, there’s just as much skepticism. In the PSP space, where regulation and patient trust are paramount, the challenge is figuring out where AI can make a safe and meaningful difference.
There’s a lot of hype around AI in healthcare and patient support programs – what parts do you see as real opportunities, and what’s just noise?
Amos:
The noise is in one-size-fits-all chatbots or hallucination-prone LLMs pretending to be clinical and engaging.
We’ve identified three big opportunities in low-risk areas:
- Personalization and segmentation using data
- Augmenting and scaling care within patient support workflows from a patient rep or clinical support perspective
- Operational uses of LLMs – things like question answering and care coordination conversations rather than anything that requires clinical validation.
These are the lowest-hanging fruit in healthcare for patient engagement because healthcare is so highly regulated – any clinical application requires extensive validation. That’s why we focus on AI in supportive rather than prescriptive ways — always with a human in the loop. It’s really AI as a co-pilot, not an automated robot. We’re not there yet in healthcare.
For someone new to this space, what does using AI and data to improve patient journeys and support actually look like in practice, and what goes into it?
Amos:
AI is really good at predicting things. Imagine knowing the risk of drop-off, relapse, or complexity before it happens. That typically relies on historical data.
Predictive algorithms use your own behavior and data, or that of individuals and populations with similar patterns. Much of this involves time series analysis — looking at long stretches of health data to use past behavior to predict future outcomes. With enough longitudinal data, you can identify moments in the data that signal:
- Non-adherence
- Likelihood of switching medications
- Adding new prescriptions
Claims data and refill data are common inputs here. We sometimes call this ‘predictalytics’ — using analytics to anticipate health-related changes. A simple way to think about it is like Netflix: it recommends shows based on your viewing history. In healthcare, the same principle applies but using large sets of health data to identify future patterns.
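The refill-data signal Amos describes can be sketched with the proportion of days covered (PDC), a widely used adherence metric computed from fill dates and days supplied. This is a minimal illustration, not MEMOTEXT’s actual model; the 0.8 PDC threshold is a common convention in adherence research, and the function names are hypothetical.

```python
from datetime import date

def proportion_of_days_covered(fills, window_start, window_end):
    """PDC: fraction of days in the window covered by medication on hand.

    `fills` is a list of (fill_date, days_supplied) tuples, e.g. from
    pharmacy claims or refill records.
    """
    covered = set()
    total_days = (window_end - window_start).days + 1
    for fill_date, days_supplied in fills:
        for offset in range(days_supplied):
            day = fill_date.toordinal() + offset
            if window_start.toordinal() <= day <= window_end.toordinal():
                covered.add(day)
    return len(covered) / total_days

def flag_non_adherence_risk(fills, window_start, window_end, threshold=0.8):
    # Patients below the conventional 0.8 PDC threshold are flagged
    # as candidates for proactive outreach.
    return proportion_of_days_covered(fills, window_start, window_end) < threshold

# Two 30-day fills with a gap: the second fill comes 20 days late.
fills = [(date(2024, 1, 1), 30), (date(2024, 2, 20), 30)]
pdc = proportion_of_days_covered(fills, date(2024, 1, 1), date(2024, 3, 15))
```

In practice, a signal like this would feed a predictive model alongside other longitudinal features, so that outreach happens before the drop-off, not after.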
Kimberley:
This is one of the complexities Amos and I have talked about when it comes to implementing AI in patient support programs — it all comes back to the data.
You need data sets that are robust, clean, and structured enough to run meaningful analyses. That means you can’t just ‘lift and shift’ AI models into PSPs that were designed a decade ago without that foresight. Programs need to be built with the future in mind, because without the right data, the potential of AI is limited.
What should organizations keep in mind – and what questions should they be asking – before bringing AI into their solutions?
Amos:
The same questions that have always applied in healthcare still apply with AI – it’s just another tool. A powerful one, but it’s still meant to support your clinically driven business priorities.
Start with the problem: what are you actually trying to solve, and are vendors dictating your priorities? From there, think about governance, transparency, safety, bias, safeguards, interoperability – and always ask, do we have the data, and permission, to learn from it? Finally, how will this support or burden clinicians and patients?
Kimberley:
Some of the first real opportunities for AI in PSPs are in areas where structured data already exists — like telephony systems or CRMs that have been capturing information for decades.
We’re already seeing AI solutions integrated into service models, like those used by telephony providers to analyze call data and enhance caseworker responses in real time. These insights can be easily applied within PSPs today, since that data is already systematically captured.
Creating therapy-agnostic tools like chatbots can significantly enhance patient engagement by offering real-time assistance and guidance, making healthcare interactions more accessible and responsive. Beyond this, more work is still needed for AI applications focused on patient-specific therapeutic journeys and desired outcomes.
One of the big complexities for PSPs is being mindful not just of the data itself, but also of consent. Ultimately, success with AI in PSPs will require building programs with data — and patient consent — front and center.
Untangling The Data: Data Strategies in Patient Support Programs
Behind every AI conversation lurks the same sticking point: data. What counts as “good” data? How much of it do you really need? And how do you gather it responsibly, without drowning in information or compromising patient trust?
Data seems to trip a lot of people up. Teams want AI, but they’re not sure what data they actually need — what’s your advice?
Amos:
People are both overwhelmed and excited by AI. The key thing to understand is that your data strategy should always be driven by the problem you’re trying to solve — not by collecting data first and then searching for a problem to match it.
We focus on setting the stage with a broad and comprehensive data strategy from the start. That means building in ethical, consent-driven data collection throughout patient and customer interactions and combining that with data cleaning and tagging so it can be used later. AI can answer key questions, but it also sparks new ones — often leading you further down the rabbit hole of insight.
Kimberley:
A lot of organizations don’t always know what the right problem statements are — and that’s where collaboration can be so valuable. Working together, we can identify problem statements and use cases — even when clients may not see them themselves.
Another important point is that you should only collect data you’ll actually use. Pulling massive amounts of data without clear use cases doesn’t align with best practices. By coming to the table with well-defined problem statements, we can guide clients to structure their PSPs to capture the right data and then apply that data to help solve real challenges.
Building Stronger Partnerships
Even the best tools won’t succeed in isolation. As PSPs become more complex, no single organization can cover every angle on their own. The future will depend on collaborations that combine technological expertise, clinical experience, and patient perspective.
How do you see partnerships between technology providers, healthcare organizations, and patients evolving to meet those future expectations – and what do you think makes a strong partnership?
Amos:
The old vendor-client model is changing. We’re moving toward co-creation consortia. Patients, consumer tech, and clinicians will be at the table together.
Being strategically aligned within your partnerships is important, and coming from similar backgrounds in terms of history and experience helps. We’ve known STI for a long time, and there’s a mutual understanding of the space. Both groups are highly capable, no-nonsense, straightforward and reliable. That matters today.
Kimberley:
To meet the evolving demands of healthcare and PSPs, providers must recognize their strengths and limitations.
We’ve heard multiple clients say they don’t want to keep reinventing the wheel — they want a succinct solution. What we’re doing with STI and MEMOTEXT is a great example: closing the gap for clients, reducing the number of vendors they need to engage with, and leaning on subject matter expertise where it exists.
Effective communication, consistency, and transparency are crucial. You want your partner to know where you really are, your limitations and capabilities. I’d much rather someone be honest than try to sell me something half-baked. That honesty builds trust and accountability.