Incorporating Qualitative Data When Training Behavior Analysts
ABA training tends to rely heavily on numbers: test scores, checklists, competency rubrics. But numbers alone can miss what it actually feels like to be the learner. This paper explores how qualitative data (words, stories, reflections) can fill that gap. For supervisors and instructors, adding qualitative tools can mean catching problems earlier and making training fit the learner better.
What is the research question being asked and why does it matter?
This paper asks a training question: how can behavior analysts use qualitative data to train students and supervisees more effectively? The authors focus on three areas: social validity (does the training feel fair and useful), metacognition (does the learner know what they know and don’t know), and learning outcomes (what the learner actually learned).
This matters because most ABA training leans hard on numbers and can miss what it feels like to be the learner. When we miss that, we may keep using teaching or supervision methods that “work on paper” but don’t work for that trainee in that setting.
Supervision and coursework involve power differences. A trainee may look “fine” in performance data but feel confused, unsafe, or burned out. If you never ask in a way that makes it safe to answer, you may not find out until the trainee quits, avoids you, or copies your feedback without real understanding. Qualitative tools help you notice problems earlier.
The paper isn’t saying to stop using quantitative measures. It’s saying that words and stories can add missing context so you can make better decisions. The goal is better fit—between your teaching methods and the learner, and between training demands and the real world the trainee is working in.
What did the researchers do to answer that question?
This is a discussion paper, not an experiment. No clients, trainees, or students were studied, and no outcome data were collected. The authors review key ideas from qualitative research and give examples of how an ABA instructor or supervisor could collect and use qualitative information during training.
They describe common qualitative options: open-ended written reflections, interviews, narrative inquiry (getting the person’s story), and thematic analysis (finding repeated themes in what people say).
For coursework, they describe ways to collect social validity feedback during the semester instead of only at the end. They warn that end-of-course teaching evaluations can be biased, especially against women and instructors of color. They suggest “action research” in teaching—setting a goal, trying a teaching change, collecting learner feedback, adjusting, and repeating.
For supervision, they suggest semi-structured check-ins and narrative-style questions to learn what’s going well and what’s hard. They also discuss metacognition supports like think-aloud problem solving, where the trainee explains their thinking while working through a task so the supervisor can see the process, not just the final answer.
Across these examples, they emphasize that the supervisor’s response matters: listen first, avoid making the person defend their feelings, and use the information to adjust support.
Because this isn’t a data-based study, the paper can’t tell you these strategies will always improve performance, reduce burnout, or increase cultural responsiveness. It offers a practice framework and points to other literature that suggests these areas are important.
How you can use this in your day-to-day clinical practice
If you supervise trainees or lead RBTs, add a small, repeating qualitative check-in that’s separate from skill scoring. Do it often enough that you can act on it, not just document it.
For example, every two to four weeks, ask the supervisee to answer three short prompts in writing: what’s going well, what feels hard, and what change would help supervision fit them better. Tell them up front what will happen with the feedback—what you can change now, what you can’t, and why. This makes the process feel safer and makes honest information more likely.
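If you want a consistent record of these check-ins, a minimal sketch of what that could look like is below. The log file, field names, and example answers are all invented for illustration; nothing here comes from the paper itself.

```python
import json
from datetime import date
from pathlib import Path

# Hypothetical storage file; any consistent location works.
CHECKIN_LOG = Path("supervision_checkins.jsonl")

# The three prompts from the check-in routine described above.
PROMPTS = (
    "What is going well?",
    "What feels hard right now?",
    "What change would help supervision fit you better?",
)

def record_checkin(trainee_id: str, answers: dict[str, str]) -> None:
    """Append one dated check-in so responses can be reviewed over time."""
    entry = {
        "date": date.today().isoformat(),
        "trainee": trainee_id,
        "answers": answers,
    }
    with CHECKIN_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

# Example usage with made-up responses:
record_checkin(
    "trainee_01",
    {
        PROMPTS[0]: "Pairing sessions are going smoothly.",
        PROMPTS[1]: "Graphing conventions still feel confusing.",
        PROMPTS[2]: "More modeling before I run a new program.",
    },
)
```

The point of a routine like this is not analysis software; it is that the same three questions get asked on the same schedule, so you can see change over time instead of relying on memory.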
During supervision meetings, use at least one open-ended “story” question instead of only yes/no questions. Ask for a recent success story and a recent hard moment, then listen for what the trainee values and what situations trigger stress.
Your job isn’t to debate whether the stress is “reasonable.” Your job is to learn what variables are present so you can adjust training plans, staffing supports, and expectations. If the trainee says, “I felt overwhelmed,” follow with “What part felt biggest?” instead of “Why did you feel that way?”
Use qualitative data to improve social validity without turning supervision into a popularity contest. Social validity here means the training is understandable, humane, and connected to real job needs. You can keep high standards while still adapting how you teach.
When feedback requests something you can’t do (like “no data ever”), name the boundary and offer choices—simplifying the system, training fluency, or changing when data are taken. This protects dignity while still protecting client care.
Build metacognition into your supervision so you can see thinking, not just performance. A simple method is a “think-aloud” during clinical tasks. Ask the trainee to talk through how they would pick a function-based replacement skill, or how they would decide whether to change prompting, while you take notes on their steps.
Then coach the process: what steps were missing, what assumptions showed up, and what data they should check next time. This often helps more than correcting the final answer because it targets the decision-making chain.
Use reflections as permanent products that guide your training plan. After a hard case meeting, ask the trainee to write a short reflection: what they noticed, what they’re unsure about, and what support they want next.
Compare what they say they can do with what you observe them doing. If they think they’re strong in an area but performance shows gaps, you now have a clear coaching target. If performance looks fine but they report low confidence, you can add practice and feedback without waiting for an error in the field.
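One way to make that comparison concrete is a small sketch like the one below. The skill names, the 0-100 scale, and the mismatch threshold are all assumptions made for illustration, not anything the paper prescribes.

```python
# Pair self-rated confidence with observed performance (both on an
# assumed 0-100 scale) and flag mismatches worth a coaching conversation.

confidence = {"preference assessment": 90, "graphing": 40, "prompt fading": 75}
observed = {"preference assessment": 60, "graphing": 85, "prompt fading": 78}

GAP = 20  # arbitrary threshold for a "meaningful" mismatch

for skill in confidence:
    diff = confidence[skill] - observed[skill]
    if diff >= GAP:
        # Thinks they're strong, performance shows gaps: clear coaching target.
        print(f"{skill}: overconfident (self {confidence[skill]}, observed {observed[skill]})")
    elif diff <= -GAP:
        # Performs fine but reports low confidence: add practice and feedback.
        print(f"{skill}: underconfident (self {confidence[skill]}, observed {observed[skill]})")
```

However you do the comparison, the decision rule is the same: large gaps in either direction become the next training target, before an error shows up in the field.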
If you want to use qualitative tools to support culturally responsive practice, focus on values and context, not labels. Ask families and trainees questions that allow multiple “right answers”—what good progress looks like to them, what routines matter most, and what support feels respectful.
Then help the trainee notice when their own learning history shapes assumptions about “appropriate behavior” or “good parenting.” Don’t treat this as a one-time diversity talk. Treat it as an ongoing clinical skill: noticing your own bias cues, checking with the client, and making space for choice.
Be careful about power dynamics. Trainees may say “everything is fine” because they fear punishment, lost hours, or a damaged relationship. Make it clear that feedback won’t be used to retaliate, and show that by responding calmly when they disagree.
If you notice the trainee never disagrees, never brings up problems, or always accepts your feedback quickly, treat that as data that the environment may not feel safe yet. You can respond by offering anonymous feedback options, giving them time to write instead of speaking live, and reinforcing honest disagreement when it’s done respectfully.
In coursework or group supervision, avoid relying only on end-of-term ratings. If you’re an instructor or clinical leader, add brief mid-course or mid-rotation feedback points so the same learners benefit from the changes.
Interpret ratings with caution because bias is real. Use multiple data sources: participation, work products, direct observation, and the learner’s narrative feedback. Don’t let one harsh comment drive big changes, but do look for repeated themes across people and time.
Finally, treat qualitative data like any other data: collect it on purpose, organize it, and use it to make decisions you can explain. You don’t have to run a full research project, but you should be consistent.
Pick a small routine you can maintain, review the themes monthly, and make one practical adjustment at a time. Keep your clinical judgment in charge, and use qualitative information as added context—not as the only deciding factor.
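If you tag each reflection or check-in with the themes you hear as you read it, the monthly review can be as simple as counting how often each theme appears and across how many people. A minimal sketch, with invented trainees, months, and theme names:

```python
from collections import Counter

# Each reflection gets one or more theme tags when you read it;
# the entries and tags here are invented for illustration.
entries = [
    {"trainee": "A", "month": "2025-05", "themes": ["scheduling", "feedback clarity"]},
    {"trainee": "B", "month": "2025-05", "themes": ["feedback clarity"]},
    {"trainee": "A", "month": "2025-06", "themes": ["feedback clarity", "caseload"]},
]

# Count how often each theme shows up, and across how many people,
# so one harsh comment doesn't outweigh a repeated pattern.
theme_counts = Counter(t for e in entries for t in e["themes"])
theme_people = {t: {e["trainee"] for e in entries if t in e["themes"]}
                for t in theme_counts}

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} mentions from {len(theme_people[theme])} trainee(s)")
```

This mirrors the decision rule from the paragraph above: a theme mentioned once by one person is context; a theme repeated across people and months is a signal worth one practical adjustment.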
Works Cited
Diller, J. W., & Li, A. (2025). Incorporating qualitative data when training behavior analysts. Behavior Analysis in Practice. https://doi.org/10.1007/s40617-025-01124-2