Designing Evidence: How Behavioral Research Makes HTA More Human


Health Technology Assessment (HTA) relies on rigorous evidence to inform decisions about what technologies, treatments, and interventions should be funded, prioritized, or approved. Yet much of the existing evidence infrastructure is blind to behavioral drivers: the social, emotional, and cognitive patterns that shape real-world health outcomes. This article explores how behavioral research can enrich HTA by making it more human-centered, explanatory, and actionable. It outlines current limitations in HTA evidence models, proposes how behavioral methods can close those gaps, and suggests future pathways for integrating this lens across the HTA lifecycle.

Introduction

In the world of HTA, the gold standard is evidence. Evidence to prove efficacy. Evidence to calculate value. Evidence to compare options. But what counts as evidence, and who gets to decide, remains a critical blind spot in global health systems. The processes used to generate this evidence are often clinical, statistical, and economic. What they lack is insight into behavior.

Behavioral research offers a path forward. It can reveal why people adopt or abandon a treatment. Why clinicians resist new protocols. Why patients fail to adhere or exceed expectations. These dynamics shape outcomes just as much as any molecule or policy. When HTA integrates behavioral research, it moves closer to evaluating technologies not only on what they do in trials, but also on how they live in people’s lives.

The Limits of Traditional Evidence in HTA

HTA has traditionally prioritized clinical effectiveness and cost-effectiveness as the cornerstones of decision-making. Randomized controlled trials (RCTs), systematic reviews, and economic modeling dominate submissions and evaluations.

But these methods often miss the context in which technologies are used. They may show whether a drug works in ideal conditions but not whether people will choose it, stick with it, or experience its effects in meaningful ways.

This gap has led to growing interest in Patient Experience Data (PED) and patient-centered outcomes, which provide critical information about the lived realities of care. Still, PED remains underutilized, often siloed or appended late in the assessment process.

Behavioral insights are not meant to replace these methods. Instead, they can amplify their relevance by grounding them in real-world human behavior.

Behavioral Science as Evidence Generator

Behavioral research generates evidence about how people actually make decisions, navigate systems, and respond to interventions. It can explain phenomena that clinical trials alone cannot:

  • Why patients with the same prognosis behave differently
  • Why uptake of new tools varies across providers
  • Why a treatment with great efficacy may still fail in the real world

Behavioral evidence is often derived from mixed methods: ethnographic interviews, choice architecture assessments, in-situ observation, A/B testing, journey mapping, and behavioral audits.

These tools help surface cognitive biases, emotional triggers, social norms, and environmental frictions, all of which shape the outcomes HTA aims to evaluate.

From Participants to Partners

A defining feature of behavioral research is its commitment to participatory methods. Patients, caregivers, and clinicians are not seen as passive subjects, but as active co-creators of insight.

Where surveys may collect data retrospectively, behavioral design studies often engage participants longitudinally, capturing shifting behaviors, motivations, and barriers in real time. This makes them especially valuable in chronic care, mental health, and adherence-focused interventions.

Incorporating these methods early in the HTA lifecycle could:

  • Inform technology design before market entry
  • Improve stakeholder alignment during evidence generation
  • Reveal misalignments between intended use and actual behavior

Designing with the people affected by technologies strengthens the quality and legitimacy of HTA decisions.

What HTA Can Learn from Behavioral Design

Behavioral design brings structured approaches to influencing behavior ethically and effectively. Frameworks like MINDSPACE (Messenger, Incentives, Norms, Defaults, Salience, Priming, Affect, Commitments, Ego) and EAST (Easy, Attractive, Social, Timely) have been widely adopted in public health policy and could inform HTA practice as well.

These frameworks offer a way to test not only what works, but how and why: helping HTA bodies move from evaluating technologies in isolation to understanding their ecosystem of use.

Behavioral design can:

  • Highlight assumptions baked into cost-effectiveness models
  • Guide the selection of endpoints that reflect real human value
  • Shape stakeholder engagement strategies that go beyond tokenism

By integrating behavioral design into HTA, we move toward a model of assessment that is both rigorous and relational, one that acknowledges that technologies do not operate in a vacuum.

Toward a More Human-Centered HTA

Evidence matters. But it must be evidence that reflects the realities of use, not only the rigor of design. Behavioral research offers tools to humanize HTA, ensuring that what gets measured, funded, and scaled aligns with what people actually need, want, and do.

This is not a call to dilute standards, but to expand them. To recognize that behavior is not a confounding variable to control, but a core dimension to understand.

If HTA is to evolve alongside the complex systems it seeks to guide, then behavioral research must be part of its foundation.

Conclusion

What we call evidence is ultimately a mirror of what systems choose to value. When HTA frameworks begin to reflect behavior in all its complexity, unpredictability, and context-dependence, they also begin to reflect life. The shift toward behavioral research is not a methodological trend. It is a quiet reorientation toward care, coherence, and credibility in how we measure what matters. The more human our evidence becomes, the more justifiable and just our decisions will be.

Footnotes

  1. Oliver, A. (2017). MINDSPACE: Influencing behaviour through public policy. Behavioural Public Policy, 1(2), 258–266. https://doi.org/10.1017/bpp.2017.13
  2. The Behavioural Insights Team. (2014). EAST: Four simple ways to apply behavioural insights. https://www.bi.team/publications/east-four-simple-ways-to-apply-behavioural-insights/
  3. Facey, K., Rannanheimo, P., Batchelor, L., Borchardt, M., de Boer, W., & Thomas, V. (2022). Real-world evidence to support HTA evaluations: A case for patient-reported outcomes and experience data. International Journal of Technology Assessment in Health Care, 38(1). https://doi.org/10.1017/S0266462321000711
  4. Peters, D. H., Tran, N. T., & Adam, T. (2013). Implementation research in health: A practical guide. World Health Organization. https://apps.who.int/iris/handle/10665/91758
