Can a Better Data Sheet Improve Tact Training?
Many clinics assume that handing staff a data sheet means they’ll collect accurate data. But that’s not always true. This study asks a practical question: can a redesigned data sheet—one that prompts key steps and pre-sets the trial order—help staff run tact training more correctly and record data more accurately? For supervisors looking to improve program quality without adding more meetings or oversight, the findings offer a simple, low-cost place to start.
What Is the Research Question, and Why Does It Matter?
The question is whether an “enhanced” data sheet helps staff run tact training more accurately and collect better data than a standard sheet.
This matters because even small missed steps in teaching can slow learning or build faulty stimulus control. In real clinics, common errors include skipping attending checks, delivering reinforcement incorrectly, and rotating targets in the wrong order.
If a data sheet can prompt staff to complete key steps, you may get cleaner teaching and cleaner data—without adding supervision hours.
The second issue is data accuracy. If data are wrong, a BCBA may make a poor decision: changing a program too soon or continuing one that isn’t working. This study tests whether changing the data sheet can reduce those errors during tact training of features (like teaching “handle” on a toothbrush).
For a busy supervisor, the practical question is simple: should we change the form we use so staff make fewer mistakes?
What Did the Researchers Do?
They recruited 20 ABA staff—mostly technicians and RBTs, with a few BCBAs—and randomly assigned each to one of two groups.
- Standard data sheet group: Targets were not pre-set, and there were no prompts for key steps.
- Enhanced data sheet group: Targets and trial order were already printed, with spaces to mark attending and reinforcement.
Everyone watched a short training video showing the tact-training procedure with their assigned sheet. Then each person ran one 12-trial teaching session over Zoom with an adult confederate acting as a child with autism. The session included three items, two features per item, with each feature taught twice.
Observers scored how well each instructor completed key steps: securing attending first, delivering reinforcement correctly, rotating targets in the proper order, and ensuring each feature was taught twice. They also compared each participant’s recorded data against the observer’s data to measure accuracy.
How to Use This in Your Day-to-Day Practice
Start with the data sheet, not just the staff.
If you’re seeing messy data or uneven teaching in tact programs, look at the form first. In this study, the biggest gap was data accuracy: people using the enhanced sheet had much higher average accuracy than those using the standard sheet. The standard-sheet group often recorded data in the wrong row, left trials blank, or missed key parts of the trial.
Don’t assume a basic trial-by-trial sheet will produce usable data—especially when staff are new, rushed, or juggling materials.
Pre-plan the trial order when rotation matters.
The enhanced sheet listed targets and order in advance. Staff using it hit 100% accuracy on presenting targets in the planned random rotation. With the standard sheet, rotation accuracy was lower.
If your goal is to reduce faulty stimulus control, your sheet should make correct rotation easy and incorrect rotation harder. For tacting features, this can be as simple as printing the 12-trial sequence with the item and feature already written in each row—rather than asking staff to decide the order while teaching.
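If you build your sheets in a spreadsheet or script, a short generator can pre-fill that rotation so nobody has to choose targets mid-session. Here is a minimal Python sketch under illustrative assumptions: the item and feature names, the "no back-to-back items" rule, and the row layout are placeholders to adapt, not the study's actual materials.

```python
import random

# Hypothetical targets: 3 items x 2 features each, matching the 12-trial session format.
TARGETS = {
    "toothbrush": ["handle", "bristles"],
    "shoe": ["laces", "sole"],
    "cup": ["rim", "lid"],
}

def build_trial_order(targets, repetitions=2, max_tries=1000):
    """Return a shuffled trial list in which each feature appears `repetitions`
    times and the same item never comes up on back-to-back trials."""
    trials = [(item, feature)
              for item, features in targets.items()
              for feature in features] * repetitions
    for _ in range(max_tries):
        random.shuffle(trials)
        if all(trials[i][0] != trials[i + 1][0] for i in range(len(trials) - 1)):
            return trials
    raise ValueError("Could not find an order without back-to-back items.")

if __name__ == "__main__":
    for number, (item, feature) in enumerate(build_trial_order(TARGETS), start=1):
        # Each printed row becomes one pre-filled line on the enhanced sheet.
        print(f"Trial {number:2d} | item: {item:10s} | feature: {feature:8s} "
              f"| attending [ ] | response ____ | reinforcement [ ]")
```

Run it once before the session and paste the rows into your template, so the sheet, not the instructor, carries the rotation.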
Add brief prompts for commonly missed steps.
The enhanced sheet included spaces to mark attending and reinforcement delivery. The between-group differences on those steps weren't as large as the data-accuracy gap, but the enhanced group still completed them more consistently.
In real sessions, missed attending checks often happen when the learner is distracted or staff feel pressured to move quickly. A checkbox that requires marking attending can slow things down just enough to help the learner contact the right stimulus—as long as the step is respectful and brief.
Use enhanced sheets as a coaching tool during onboarding.
This study used only brief video training plus one practice session, yet the enhanced sheet still helped. That suggests it can serve as “training wheels” for new staff while they build fluency.
Start new hires on enhanced sheets for programs with higher risk of stimulus-control problems—feature/function/class tacts, multiple exemplars, or mixed targets. As the staff member shows stable fidelity, you can decide whether to fade the prompts or keep them.
Don’t assume an enhanced sheet fixes everything.
This study was a role-play over telehealth with an adult confederate, and each staff member ran only one short session. Real learners may engage in problem behavior, leave the area, refuse materials, or need preference changes. Those challenges can affect fidelity in ways this study didn't capture.
Treat this as support for a low-cost systems change, not proof that learner outcomes will automatically improve. You still need direct observation, feedback, and clinical judgment.
Watch for hidden costs before switching every form.
Enhanced sheets take time to build and update, especially if every program uses a different format. Too many sheet styles can confuse staff and raise response effort.
A balanced approach: standardize one enhanced template that works across many tact programs, with only a few fields that change (the target list and planned order). Keep the layout consistent so staff don’t have to relearn the sheet for each client.
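One way to keep that layout consistent is to generate every sheet from the same template and swap in only the planned order. The sketch below assumes a CSV workflow; the column names and file name are hypothetical, and your clinic's fields will differ.

```python
import csv

# Hypothetical per-program input: only the planned trial order changes between sheets.
PLANNED_ORDER = [
    ("toothbrush", "handle"), ("shoe", "laces"), ("cup", "rim"),
    ("toothbrush", "bristles"), ("cup", "lid"), ("shoe", "sole"),
]

# One fixed column layout shared by every tact program, so staff never relearn the sheet.
COLUMNS = ["trial", "item", "feature", "attending_secured",
           "learner_response", "reinforcement_delivered", "notes"]

def write_sheet(path, planned_order):
    """Write one enhanced data sheet as a CSV with the trial order pre-filled."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(COLUMNS)
        for number, (item, feature) in enumerate(planned_order, start=1):
            writer.writerow([number, item, feature, "", "", "", ""])

write_sheet("tact_features_session_01.csv", PLANNED_ORDER)
```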
Build in routine accuracy checks.
Poor data accuracy was common with the standard sheet—and still varied widely with the enhanced sheet. Some participants scored 0% accuracy even with the enhanced version. A better sheet is not a guarantee.
Do quick spot checks: a supervisor runs five minutes of simultaneous data collection and compares trial-by-trial agreement. If agreement is low, simplify the sheet, retrain the specific error, and reduce extra paperwork during teaching so the learner stays the priority.
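The agreement check itself is simple arithmetic: count the trials where the two records match and divide by the total number of trials. Here is a minimal sketch; the response codes and the 80% rule of thumb are common conventions used for illustration, not findings from this study.

```python
def trial_by_trial_agreement(staff_records, supervisor_records):
    """Percent of trials on which the staff and supervisor records match exactly:
    agreements / total trials * 100."""
    if len(staff_records) != len(supervisor_records):
        raise ValueError("Both records must cover the same trials.")
    agreements = sum(s == o for s, o in zip(staff_records, supervisor_records))
    return 100 * agreements / len(staff_records)

# Hypothetical 5-minute spot check: "+" = correct, "-" = incorrect, "P" = prompted.
staff      = ["+", "+", "-", "P", "+", "+"]
supervisor = ["+", "-", "-", "P", "-", "+"]

print(f"Trial-by-trial agreement: {trial_by_trial_agreement(staff, supervisor):.0f}%")
# ~67% here, below a common 80% rule of thumb, so revisit the sheet and retrain the error.
```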
When redesigning a tact data sheet, aim for three goals:
- Pre-set the trial sequence so correct rotation is the default.
- Prompt key trial components that are often missed (attending and reinforcement).
- Make it easy to mark responses in the right place every time.
If the form pulls staff’s attention away from the learner too often—or feels like “more boxes than teaching”—revise it. The point is to protect learning quality and learner dignity while making it easier for staff to do the right steps under real clinic pressure.
Works Cited
Halbur, M., Reidy, J., Kodak, T., Cowan, L., & Harman, M. (2024). Comparison of enhanced and standard data sheets on treatment fidelity and data collection for tact training. Behavior Analysis in Practice, 17, 533–543. https://doi.org/10.1007/s40617-023-00869-y



