Beyond Social Validity: Using Qualitative Methods in ABA Practice
When behavior happens out of sight, like a teenager running away from a foster placement, traditional observation falls short. This paper explores how ABA providers can use qualitative methods to build better assessments and interventions for these complex, high-stakes situations. These approaches matter because the alternative is often guesswork, which can produce plans that miss the point or make things worse.
What is the research question being asked and why does it matter?
This paper asks how ABA providers can use qualitative methods—not just rating scales or “social validity” questions—to build better assessments and interventions for complex problems. The authors focus on two areas where typical ABA observation falls short: youth running away from foster care and youth being targeted for human trafficking. In these settings, the most important events often happen when adults are not present.
This matters because when we cannot see the behavior chain, systems often default to control and punishment. The paper describes common responses to youth running away: loss of privileges, level drops, more surveillance after return. Those steps may feel logical to adults, but they can miss why the youth ran and can damage trust. If we want plans that work—and that youth will accept—we need ways to learn what the youth was trying to get or escape.
The authors also argue that qualitative data helps ABA teams avoid designing interventions that look fine on paper but do not fit the setting. Child welfare teams carry heavy workloads, limited resources, and high turnover. If an intervention is not realistic for caseworkers, foster parents, or group home staff, it will not be implemented well. Qualitative methods can surface these barriers early, so teams can adjust before rolling out a plan.
What did the researchers do to answer that question?
This is a methods-focused paper with examples from the authors’ past work, not a new controlled trial. The main example is the development of a structured interview for youth who run away from foster care placements, called the Functional Assessment Interview for Runaways (FAIR).
To build the FAIR, the team used focus groups and structured questions with the people closest to the problem. They ran separate groups with child welfare staff, teachers, and youth with runaway histories, including youth who had aged out of care and could reflect back. They kept each group homogeneous by age or role so participants would feel safer speaking openly. Sessions were recorded, transcribed, and coded by multiple team members using a systematic process for identifying recurring themes.
From these interviews, the team identified repeating themes about why youth ran. They grouped reasons into two broad patterns: running “to” access something (friends, family, independence, normal teen activities, or sometimes risky activities) and running “from” something (conflict, restrictive rules, unsafe or disliked living situations, negative interactions). They used these themes to rewrite interview questions in ways that built rapport and got clearer information—such as asking what was “good” and “not so good” on the run before asking if those were reasons for leaving.
They also used a second round of focus groups to build an intervention guide that matched intervention options to likely functions and reinforcers. The paper notes an unpublished pilot showing decreases in running away when plans were matched to FAIR results, but details are not provided.
For human trafficking, the authors describe how qualitative case files and provider interviews can identify common “lures” and grooming tactics, which can then guide prevention teaching. They describe behavior skills training (BST) and in-situ practice as possible formats, while noting the ethical challenge of not putting youth in real danger.
How you can use this in your day-to-day clinical practice
When a problem behavior is rare, dangerous, or happens out of sight, plan from the start to use more than direct observation. Running away, exploitation risk, and many high-stakes choices happen privately. If you rely only on staff checklists or quick rating scales, you may build a plan around adult guesses. Instead, treat the learner’s story as important data and collect it carefully.
Use interviewing to build a functional hypothesis, but choose language that protects dignity and supports honesty. The paper’s key clinical move is shifting from “Why did you do that?” to questions that are easier to answer and less blaming: “What was good about it?” and “What was not so good about it?” Then you can ask, “Was that part of why you left?”
This matters because a youth may not trust you enough to answer direct “why” questions, especially if they expect punishment. Your assessment quality depends on rapport, not just your form.
In practice, separate “running to” from “running from” early—they lead to different interventions.
If the youth ran to access people, activities, or independence, you will likely need to build safe, planned access to those needs. That might look like scheduled contact with supportive people, transportation plans, supervised community access, or choice-making opportunities that feel age-appropriate.
If the youth ran from conflict, restrictive rules, or unsafe conditions, you will need to change the environment, not just the youth’s behavior. That might include reducing power struggles, changing unnecessary house rules, improving staff responses to minor rule-breaking, or addressing peer conflict in the placement.
Do not build a plan that only addresses “what happens after they return.” Post-run punishment can increase secrecy and reduce the chance the youth will share what happened. A better target is prevention routines and replacement paths that meet the same needs with less risk. If you must set safety limits, pair them with clear, respectful explanations and meaningful choices so the youth still experiences some control. Your goal is stability and safety, not “winning” compliance.
If you work in foster care, group homes, or similar systems, design interventions that staff can actually do. The paper highlights feasibility as a core reason to use qualitative input. Before finalizing a behavior plan, ask staff what is realistic across shifts, what rules are non-negotiable due to policy, and what resources are missing. If the plan requires perfect staffing, constant monitoring, or meetings that never happen, it is not a good plan—even if it is behavior-analytic.
Consider adding simple, repeatable qualitative routines to your standard assessment process. After a critical incident, you can do a brief structured debrief with the youth and with staff using the same open-ended prompts each time. Keep prompts focused on setting events, triggers, what the youth gained or avoided, and what would have made it easier to choose a safer option. You are not doing therapy—you are gathering data to build a safer environment.
For trafficking risk, treat “lures” as part of the functional context, not as moral failure. The paper describes lures like rides, money, belonging, romance, and emotional support. These offers often match real unmet needs. This means you should assess what needs are not being met and how the youth is trying to meet them. If a youth is motivated by connection, the plan should include safe connection—not just warnings about strangers.
If you decide to teach safety skills, use active practice rather than only giving information. The paper supports BST elements: clear instructions, models, rehearsal, and feedback, with practice that looks like real life. You can use role-play and simulations to teach what to say, how to handle messages, how to exit situations, and how to get help. Keep scenarios trauma-informed and voluntary, and watch for signs the learner is overwhelmed. You do not need to recreate dangerous situations to teach a safer response chain.
Stay honest about limits. Qualitative data can improve fit and guide hypotheses, but it does not demonstrate function the way an experimental functional analysis can. People can misremember, minimize, or give answers meant to test your reaction, especially when safety and fear are involved. Use interviews as one data source and pair them with whatever objective data you can get: attendance patterns, placement changes, phone access rules, contact logs, staff response patterns, and timelines of events. Update your hypothesis when new information appears.
Finally, do not assume these methods apply the same way to every learner. Youth in foster care, youth with trauma histories, and youth at risk of exploitation may have different needs than learners you typically serve in clinic-based ABA. If you use these strategies, do it with humility, consent when possible, and collaboration with the team already supporting the youth. Use qualitative input to increase choice, safety, and respect—while still using clinical judgment to decide what is appropriate for each case.
Works Cited
Crosland, K., Garcia, A., & Fuller, A. (2025). Beyond social validity: Embracing qualitative research in behavior analysis. Behavior Analysis in Practice, 18. https://doi.org/10.1007/s40617-025-01131-3



