A curly Case Study with Wellster Healthtech Group
Two churn studies, two ends of the funnel, one ongoing partnership.

How Europe's leading digital health platform turned its hardest-to-reach customers into its most valuable research signal.
The challenge
Wellster Healthtech Group operates a portfolio of brands tackling categories most people won't discuss over dinner: GLP-1 weight-loss therapy (GoLighter), men's sexual health (GoSpring), women's health (MySummer), and more.
Wellster has long understood something the rest of digital health is still catching up on: in markets where competitive pressure keeps rising and the customer is, by definition, vulnerable, the companies that win are the ones closest to their patients. Continuous, deep customer research isn't a quarterly luxury — it's a competitive moat.
The problem wasn't ambition. It was tooling.
Traditional qualitative research is too slow and too small to keep up with a multi-brand digital health roadmap. Surveys give you what people click, not what they feel. And the new generation of "qual-at-scale" platforms — most of them US-based, most of them enterprise-priced — was built for Fortune 500 marketing teams, not for fast-moving European health brands that want to run ten focused studies a year, not one giant one.
Wellster knew what they wanted to do. They needed a tool that let them actually do it.
The approach: starting where most companies don't dare to look
Wellster came to curly with an unusually sharp first brief: don't talk to our happy customers. Talk to the ones who left.
They commissioned two churn studies, one at each end of the customer journey:
GoSpring — pre-purchase churn: customers who got all the way to checkout and then walked away. What broke?
GoLighter — post-treatment churn: customers who started GLP-1 therapy, saw results, and then cancelled their subscription. Why?
This is, methodologically, the hard mode of customer research. Churned customers are difficult to reach by definition — they've already chosen to disengage from the brand. Sample sizes are smaller, fieldwork takes longer, and the conversation requires more care. Most companies avoid these studies entirely and settle for talking to the people who already love them. Wellster went straight at the uncomfortable question, twice.
That's the point. Lapsed customers are the most honest customers a brand has. They have nothing to gain from politeness.
The output: depth and rigor, on the same dataset
What landed in Wellster's hands wasn't a 60-page deck with three quotes per chapter. It was a working research environment.
For each study, curly delivered:
Real, adaptive voice conversations with each respondent — not a fixed questionnaire, but a genuine dialogue that followed what people actually said and probed where it mattered.
Quantified themes with statistical significance — every conversation analysed structurally, so "customers feel uncertain about safety" becomes "xy% of pre-purchase churn correlates with missing trust signals at checkout, Cramér's V = 0.52."
A live dashboard, not a static report — segment views, univariate and multivariate analyses, the ability to chat with the dataset and pull a fresh angle in seconds.
Downloadable reports, generated from the same underlying data, ready to share with leadership.
Findings that shipped, not findings that filed.
The quiet shift here is that the qual-quant tradeoff stops being a tradeoff. Depth and scale, on the same study. But the more important shift is what that makes possible downstream: insights that move directly from interview to roadmap. Wellster used the GoLighter findings to ship concrete product changes — including new flexibility features in the patient account that gave customers exactly the control they'd asked for. Research that ends in a feature, not a footnote.
For panel-based studies — those run on existing customer panels rather than re-engaging churned users — the same setup runs even faster: results in hours, analysis in real time. The Wellster studies were the harder kind. Worth the wait.

"What convinced me: curly turns open-ended conversations into hard data. For the first time, we didn't just have a hunch about what drives our customers — we had the proof. Measurable. Significant. Specific enough to translate directly into a product roadmap built around what our patients actually asked for. That changes how we make decisions."
— Viola Karl, Commercial VP, Wellster Healthtech Group
What two churn studies, side by side, actually tell you
We can't share the full depth of our proprietary findings — that's Wellster's edge, not ours. But the shape of what these studies surface is worth pointing at, because it's instructive for any consumer health business.
Two patterns showed up across both studies. The first: at every point in the patient journey, from the medical questionnaire to product selection to checkout, patients want concrete trust signals before they take the next step. The second: patients want flexibility, like adjusting delivery dates when they go on holiday or changing dosages when they experience side effects. Wellster turned both patterns into concrete roadmap items, including the patient-account flexibility features mentioned earlier.
Pre-purchase churn and post-treatment churn look like different problems on the surface. They aren't. Both are, at their core, trust questions asked at different moments — can I trust this enough to start? and can I trust this enough to keep going? Run as separate studies, you get two reports. Run them through the same instrument, on the same platform, and you start seeing the customer journey as one continuous trust curve, with specific failure points you can actually fix.
That's the kind of insight that changes what a product team works on next quarter.
Why voice AI fits sensitive categories so well
A pattern keeps showing up in Wellster's work with us: people are remarkably candid with a voice AI. The stigma drops. There's no judgement in the room, no sense of performing for a stranger with a clipboard. Patients talk about their bodies, their budgets, their side effects, their fears about a relapse — the real stuff.
For categories built on discretion — GLP-1, ED, women's health — that candor isn't a soft benefit. It's the difference between research that informs a decision and research that gets filed.
From pilot to ongoing partnership
The first two studies worked well enough that Wellster moved to a recurring engagement with curly — one that gives their teams continuous access to research capacity across the entire portfolio.
That capacity has since expanded into territory Wellster hadn't run at scale before: structured usability and prototype testing. Customers and prospects work through real tasks on new features, and curly follows up with deep interviews to surface what worked, what didn't, and why. Feedback that lands before code ships, not after.
Wellster's research practice didn't change because of curly. They were already a brand-builder that took customer proximity seriously. What changed is that the gap between wanting to ask and being able to ask essentially closed. When a product team at GoLighter wants to understand a churn cohort, when GoSpring wants to test a new checkout flow, when MySummer is validating a launch — the infrastructure is already there.
The cadence of customer research stopped being limited by tooling. It's now limited only by curiosity.
The bottom line
Wellster came to curly with a clear gap to close: the right ambition for customer research, but no tool fast, deep, or economical enough to match it. Two churn studies later, that gap is gone — and the partnership is now embedded across the portfolio.
We're looking forward to many more studies together.
Want to hear what your hardest-to-reach customers would tell you?
curly.de or get in touch!
➰
