Beyond Numbers: How Qualitative Methods Can Strengthen ABA Practice
Behavior analysts rely heavily on measurable outcomes (frequency, duration, percentage) to guide treatment decisions. But what happens when the numbers don’t tell the whole story? This article explores how qualitative methods, and the skills that come with them, such as open-ended interviews and reflective listening, can help clinicians understand the real-life context behind the data. When families are stressed, when plans don’t stick, or when something important gets missed, these skills can make the difference between a plan that works on paper and one that actually fits a family’s life.
What is the research question being asked and why does it matter?
This paper asks how qualitative methods—like open-ended interviews and personal stories—can help behavior analysts understand what life is really like for clients and families. The author is not testing a treatment package. Instead, she is asking whether “numbers-only” data miss important parts of a person’s experience that matter for good care.
This matters because many ABA decisions are based on what is easy to count: frequency, percent, duration. Those measures are useful, but they do not always explain why a plan is hard for a family to follow, what feels unsafe, or what matters most to the learner. Without understanding the family’s real situation, you can write a plan that looks great on paper but does not fit daily life.
It also matters because ABA services often happen when families are stressed and tired. If your assessment and check-ins only focus on “what behavior happened,” you may miss key barriers—fear, burnout, cultural values, or past bad experiences with services. Qualitative-style questions can help you find those barriers early and avoid pushing a plan that adds strain or damages trust.
What did the researchers do to answer that question?
The author used a personal narrative approach. She describes her own training in a quantitative ABA culture, then explains how she came to use qualitative methods during her PhD work on pediatric feeding difficulties. Her main example is semi-structured interviews with parents, where the interviewer uses a guide but can follow the parent’s lead with follow-up questions.
She shares what she learned from listening to parents’ stories, including one detailed example of a family whose teen had severe feeding problems and limited support. The point is not “here is the best feeding protocol.” The point is that the parent’s story revealed risk, fear, and real-life constraints that would never show up in a graph of bites or acceptance.
She then connects qualitative methods to ABA practice skills. She explains differences between structured interviews (fixed questions, same order) and semi-structured interviews (a plan, but flexible follow-ups). She also highlights specific communication behaviors that qualitative training tends to teach well—active listening, reflective statements, and the ability to pause and let silence happen.
A key limitation: this is not an outcome study or controlled evaluation. It is one clinician-researcher’s reflection, plus examples from her experience and related literature. Treat it as practice guidance and a prompt to improve your assessment and rapport—not as proof that qualitative methods will improve client outcomes in every case.
How can you use this in your day-to-day clinical practice?
When you run intakes and functional assessment interviews (FAIs), treat them less like a checklist and more like a guided conversation. You can still cover required questions, but do not rush past a caregiver’s answer just to reach the next item.
If a caregiver says, “Meals are awful,” your next step should not be “rate refusal from 1 to 5.” It should be a simple follow-up: “What does ‘awful’ look like at your house?” or “What’s the hardest part for you?” This helps you uncover setting events, safety risks, and history that might change your plan.
Use semi-structured interviewing on purpose, not by accident. Before the meeting, decide your “must cover” topics: risk, routines, what has been tried, goals, who is involved. During the meeting, give yourself permission to follow the caregiver’s lead when something important comes up.
After the meeting, write down what you learned that would not show up in ABC data—things like “family is afraid of medical procedures,” “school staff dismissed concerns,” or “caregiver is overwhelmed and cannot run 30-minute meal sessions.” Then build the plan around those facts, not around your ideal version of treatment.
Add reflective statements to your clinical toolkit, especially when the topic is emotional or sensitive. This is not therapy, and you do not need to “fix feelings.” You are checking understanding and showing you heard them.
Simple lines like, “It sounds like you’ve been dealing with this for a long time,” or “It sounds like the last plan didn’t work the way you hoped,” can reduce defensiveness and increase honest sharing. That honesty often leads to better treatment design because people tell you what they will actually do.
Practice using silence as a clinical skill. After a caregiver answers a hard question, pause for a few seconds instead of jumping in. Many people add key details after a short quiet moment.
This is a low-effort way to improve assessment quality. It helps caregivers feel less rushed and more respected. It also keeps you from interrupting when someone is trying to explain something they rarely get to say out loud.
Use qualitative-style questions to improve contextual fit—not to replace measurement. You still need observable targets and data. The change is that you also collect “word data” that explains what the numbers cannot.
For example, if a caregiver reports they can only practice a skill on weekends, that is not a “barrier” to push through. It is a real schedule limit that should shape your goals, your session design, and your expectations for generalization.
Be clear about whose voice you are missing and how you will get it. Many learners cannot easily describe their own experience with spoken language. That does not mean their experience does not matter.
In day-to-day work, you can build in choices, watch for assent and dissent, and ask caregivers what the learner does when they like or dislike something. If you only ask adults what matters, you may build plans that prioritize adult comfort over learner dignity.
Use what you learn to adjust how you present recommendations. If a family has a long history of being dismissed, they may need more time, more collaboration, and smaller first steps to rebuild trust.
If you hear fear about safety, medical procedures, or placement, you may need to coordinate with other providers and set conservative, safety-first goals. The paper’s feeding example highlights that families may be thinking about very high-stakes outcomes while providers focus on small skill changes. Your job is to notice that gap early and plan with it in mind.
Do not overinterpret subjective reports—and do not treat them as “less real” than counts. Treat them as a different type of data that needs checking.
If a caregiver says, “He never eats,” you can respectfully clarify: “When you say never, do you mean zero bites, or just very little?” Then pair their report with observation and measurement. This keeps you honest, protects decision-making quality, and prevents you from designing a plan based on vague language.
Finally, use these skills to reduce burnout (including your own) and improve teamwork. Better listening and better-fit plans can mean fewer stalled cases and fewer repeated “training” meetings where nothing changes at home.
This does not mean qualitative methods fix system problems, and it does not mean every case needs formal interviewing methods. It means your everyday practice can improve when you treat people’s stories as important clinical information and build plans that match real life.
Works Cited
Mejía-Buenaño, S. (2025). Beyond social validity: Embracing qualitative research in behavior analysis. *Behavior Analysis in Practice, 18*(1). https://doi.org/10.1007/s40617-025-01120-6