Training in trial-based functional analysis via computer-based instruction and behavioral skills training

Training Staff in Trial-Based Functional Analysis: What Works and What Falls Short

Knowing how to run a Trial-Based Functional Analysis (TBFA) correctly matters—not just for assessment accuracy, but for the treatment decisions that follow. This study examines whether online training alone prepares practitioners to implement TBFA, or whether live coaching remains essential. The findings offer practical guidance for supervisors building training systems in settings where in-person support is limited.

What is the research question being asked and why does it matter?

This study asked a straightforward training question: If a practitioner learns Trial-Based Functional Analysis through a short online lesson, can they then implement it correctly? Or do they still need live coaching?

This matters because many settings lack enough BCBAs to train staff in person, especially where ABA training systems are still developing. If training can happen partly online and partly through telehealth, more teams might be able to run TBFA with acceptable quality.

TBFA helps identify why challenging behavior occurs by embedding short “test” and “control” trials into normal routines. It often fits more easily into real classrooms and clinics than a traditional functional analysis. But it still requires careful, step-by-step implementation. If staff skip steps, change timing, or deliver the wrong consequence, the results can mislead—and that can lead to the wrong treatment plan.

The practical problem is that “knowing the steps” and “doing the steps under pressure” are not the same thing. Many agencies try to close training gaps with videos and quizzes. This study tested whether that approach is enough, and whether adding telehealth Behavioral Skills Training improves performance and helps it last.

What did the researchers do to answer that question?

Nine professionals in Japan who worked with children with challenging behavior participated. Most had no prior experience running functional analyses.

Everyone completed a computer-based instruction module online, which took about 40 minutes. The module included embedded quizzes that required correct answers before moving forward. Participants took the same 28-question multiple-choice test before and after the module.

Then each participant met by Zoom with a BCBA for role-play sessions. The BCBA played the role of a child while the participant ran TBFA trials across attention, tangible, and demand conditions—using the correct control and test steps, timing, and consequences.

The key outcome was procedural integrity: how many TBFA steps were performed correctly.
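Procedural integrity is typically scored as the percentage of checklist steps performed correctly in a trial. A minimal sketch of that arithmetic is below; the step names are hypothetical illustrations, not the checklist used in the study.

```python
def integrity_score(checklist: dict[str, bool]) -> float:
    """Return the percent of checklist steps scored as correct."""
    if not checklist:
        raise ValueError("checklist is empty")
    correct = sum(checklist.values())  # True counts as 1
    return 100 * correct / len(checklist)

# One role-play trial scored against an illustrative five-step checklist.
trial = {
    "ran control segment first": True,
    "withheld programmed consequence in control": True,
    "presented establishing operation in test": True,
    "delivered consequence only on target behavior": False,
    "ended segment at programmed time": True,
}
print(round(integrity_score(trial), 1))  # 80.0
```

Scoring each step as a simple pass/fail keeps the check fast enough to run after every role-play.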

First, participants completed one role-play assessment after the online module but before any live coaching, with no feedback. Next, they received BST via telehealth—instruction, modeling, rehearsal, and specific performance feedback—until they reached 100% accuracy. About one month later, they completed a follow-up role-play, again with no feedback, to see what they retained.

The study also tracked time spent logged into the online platform and collected simple satisfaction ratings after the follow-up.

How you can use this in your day-to-day clinical practice

Don’t treat “passed the quiz” as “ready to run the assessment.” Participants scored much higher on the knowledge test after the online module, and many got perfect scores. But when they had to run TBFA in role-play, none performed all steps correctly before live coaching.

The takeaway: TBFA training needs a performance check, not just a knowledge check. After any didactic training, schedule a short, structured role-play and score it with a checklist before the trainee works with a real learner.

Use online modules for what they do well: building vocabulary and step recognition. The online lesson clearly helped participants learn the procedure on paper. That can save supervision time—you can spend live coaching on the parts that typically fail in practice, like timing, transitions between control and test, and delivering consequences only when programmed.

Assign the online module as pre-work, then require the trainee to demonstrate the full trial sequence live before working with a client.

Center your live training on BST, especially performance feedback. Procedural integrity jumped to near-perfect levels right after BST, and most participants reached mastery quickly. Your live time should focus on modeling, rehearsal, and clear feedback—not long lectures.

When giving feedback, be specific about the step, the timing, and the consequence. Avoid vague comments like “be more clear” or “tighten it up.” Past TBFA training research suggests vague feedback can slow learning or even worsen performance.

Expect some skill loss unless you add supports. One month later, average accuracy dropped for many participants, even though they remained better than baseline.

This is a warning for real clinics: a one-time training event may not hold up when staff return to busy caseloads. Plan booster sessions—a scored role-play each month, or one in-vivo observation while staff run a trial in a natural routine. The goal is catching drift early. Small errors, like giving attention during a control segment or ending a trial late, can change the meaning of your data.

Don’t use “time spent in training” as your quality marker. The study found no link between time logged into the platform and later performance. Some people may leave the module open, skip parts, or rush.

Build in active checks that require responding: a mastery quiz with a high cutoff, video-based questions where the trainee labels what step is happening, and a live competency check before field use.

Keep the limits of this evidence in mind. This study used role-play, not real clients, so we don’t know how well these skills generalized to actual sessions. It also had a small sample and no control group. Treat the findings as a practical training hint: online teaching supports knowledge, but it’s not enough for safe, accurate TBFA implementation.

Protect learner dignity and avoid “assessment as compliance.” TBFA involves setting up conditions that momentarily remove attention, tangibles, or escape. Before a trainee runs it with a learner, make sure they can explain why the assessment is being done, what safeguards are in place, and how you’ll stop if risk increases. Make sure the team has a plan to move quickly from assessment results to a supportive, function-based intervention that teaches skills—not just reduces behavior.

A reasonable training sequence, based on these results:

  1. Assign brief online instruction to cover the steps
  2. Require a scored role-play for each condition
  3. Deliver BST with modeling and feedback until the trainee hits 100% on your checklist
  4. Schedule a follow-up check within a month
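The sequence above can be sketched as a simple decision rule: advance a trainee only when a scored role-play meets the mastery criterion, and flag drift at the follow-up check. The 100% mastery criterion matches the study's BST phase; the booster cutoff and phase names are illustrative assumptions, not values the study prescribes.

```python
MASTERY = 100.0         # BST continued until 100% accuracy on the checklist
BOOSTER_CUTOFF = 90.0   # assumed drift threshold for scheduling a booster

def next_step(phase: str, score: float) -> str:
    """Map a scored role-play to the next training action (hypothetical phases)."""
    if phase == "post-module":   # role-play after online instruction, no feedback
        return "begin telehealth BST" if score < MASTERY else "schedule follow-up"
    if phase == "bst":           # rehearsal with modeling and feedback
        return "continue BST" if score < MASTERY else "schedule follow-up"
    if phase == "follow-up":     # about one month later, no feedback
        return ("add booster with feedback" if score < BOOSTER_CUTOFF
                else "maintain monthly checks")
    raise ValueError(f"unknown phase: {phase}")

print(next_step("follow-up", 85.0))  # add booster with feedback
```

Writing the rule down this way makes the key point explicit: no trainee moves to a real learner on knowledge-test scores alone.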

If performance drops at follow-up, don’t blame the trainee. Add a short booster with feedback and make sure they get chances to practice with coaching in the real setting.


Works Cited

Togashi, K. (2025). Training in trial-based functional analysis via computer-based instruction and behavioral skills training. Behavior Analysis in Practice. https://doi.org/10.1007/s40617-025-01136-y
