Using the ADDIE Model of Instructional Design to Create Programming for Comprehensive ABA Treatment

Building a comprehensive ABA treatment plan—one that spans language, play, daily living, and behavior supports—requires more than clinical skill. It requires a clear, repeatable process. This article explores how the ADDIE instructional design model can help BCBAs organize their planning, make more intentional decisions, and explain those decisions to families and teams.


What is the research question being asked and why does it matter?

The main question is straightforward: How can ABA clinicians build and manage truly comprehensive treatment plans in a clear, repeatable way?

Many ABA articles focus on increasing or decreasing a small set of behaviors. But in real clinics, BCBAs often plan dozens or hundreds of skills across language, play, daily living, learning readiness, and behavior supports—all at once.

This matters because comprehensive work breaks down without organization. You can end up with random targets pulled from an assessment, weak prerequisites, unclear mastery rules, and poor follow-through by staff. That wastes time, frustrates learners and caregivers, and makes progress hard to see.

This paper is not testing a new intervention. It’s a practical review suggesting the ADDIE instructional design model (Analyze, Design, Develop, Implement, Evaluate) as a roadmap for comprehensive ABA programming. The value lies in having a process you can repeat—not in claiming it will work for every client.


What did the researchers do to answer that question?

The authors reviewed tools already used in education and training, then mapped them onto comprehensive ABA programming. They organized common clinician tasks into five parts: Analyze what to teach, Design the overall plan, Develop the teaching details, Implement through staff, and Evaluate whether it’s working.

For each part, they suggested actions a BCBA can take to improve quality. Examples include choosing assessments deliberately (not by habit), planning skill order based on prerequisites (not just the next item on an assessment), and defining mastery before teaching starts. They also highlighted the need for staff training and enough BCBA oversight to protect fidelity.

A key point: “Evaluate” sits in the center. You’re not only checking graphs. You’re also checking if goals matter to the learner and family, if procedures are acceptable, and if outcomes improve life outside sessions.

The authors are clear about a major limitation: this is not an outcome study. It does not show that following ADDIE causes better results than other planning methods.


How you can use this in your day-to-day clinical practice

Use ADDIE as a weekly planning checklist—not as extra paperwork. If you supervise comprehensive cases, pick one client and run your next program update through the five parts. The goal is fewer “random” decisions and more “named” decisions you can explain. This helps especially when you inherit a case or when a learner isn’t progressing and you need a clean way to find where the plan broke.

Analyze

Change your habit from “pick an assessment” to “state why this assessment fits this learner.” Write one plain sentence in your notes: “We’re using VB-MAPP to map early language and learning readiness, and Vineland to check if gains show up in daily life.”


This keeps you from treating the assessment like a curriculum. It also supports dignity—it pushes you to ask, “Is this skill important for this child in this family?” rather than “Can we check this box?”

Don’t rely on one tool. If your main assessment is strong for language but weak for play, add a short play assessment. If your main tool is criterion-based, consider a norm-referenced tool for a bigger picture—especially when families or funders ask how progress compares to typical development.

Use secondary assessments to fill gaps, but remember: many tools have uneven reliability and validity. Treat results as helpful clues, not truth, and confirm with direct observation and ongoing data.

Design

Stop building plans by marching through an assessment in order. Instead, write a simple scope and sequence for the next three months. Decide which skills are prerequisites for others and which can wait.

For example, if you want conversation goals, you may need earlier listener and tact skills—but research doesn’t always support one fixed order for all learners. Make your sequence a hypothesis, then change it when data disagree.

Define “complete” before you teach. Pick mastery rules that match the real need of the skill. If the skill must last without practice (like safety responses), you may want higher mastery, more examples, and planned maintenance checks. If the skill will be used daily and naturally reinforced (like some mands), you might move faster once it shows up at home.

Write mastery as something meaningful—use across people, settings, and materials—instead of only “80% for 3 days.” That number alone doesn’t guarantee real-life use.
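If your team tracks session data electronically, a mastery rule like this can be written down as an explicit check rather than a vibe. Here is a minimal sketch, assuming a simple hypothetical data shape (the field names `accuracy`, `setting`, and `instructor` are illustrative, not from the paper): the rule passes only when the last few sessions meet the accuracy bar *and* span multiple settings and people.

```python
# A minimal sketch of a mastery check that goes beyond "80% for 3 days":
# each session record carries accuracy plus the context it occurred in,
# so mastery also requires spread across settings and instructors.
# All field names and thresholds here are hypothetical examples.

def is_mastered(sessions, accuracy=0.8, streak=3,
                min_settings=2, min_people=2):
    """sessions: list of dicts like
    {"accuracy": 0.9, "setting": "clinic", "instructor": "RBT-A"}"""
    if len(sessions) < streak:
        return False
    recent = sessions[-streak:]
    # Accuracy criterion: every one of the last `streak` sessions passes.
    if any(s["accuracy"] < accuracy for s in recent):
        return False
    # Generalization criterion: the passing run spans multiple contexts.
    settings = {s["setting"] for s in recent}
    people = {s["instructor"] for s in recent}
    return len(settings) >= min_settings and len(people) >= min_people

data = [
    {"accuracy": 0.90, "setting": "clinic", "instructor": "RBT-A"},
    {"accuracy": 0.85, "setting": "clinic", "instructor": "RBT-A"},
    {"accuracy": 0.95, "setting": "home",   "instructor": "RBT-B"},
]
print(is_mastered(data))  # True: 3 passing sessions, 2 settings, 2 people
```

Three straight sessions at 80% in one room with one technician would fail this rule, which is exactly the point: the percentage alone is not the mastery criterion.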

Develop

Treat teaching format and materials as things you can test, not as personal style. If discrete trials are slow or stressful for a learner, try a more embedded or naturalistic format for that same target and compare data. If pictures aren’t working, try objects, real actions, video, or having the learner do the action and label it.

Run quick probes to see which format and materials produce faster learning with fewer errors and less distress, while still respecting the learner’s comfort and choice.

When you pick procedures—prompting, error correction, target mixing, reinforcement details—base the first choice on your best rationale, then let data decide. If progress stalls, don’t blame the learner or technician first. Re-check prerequisites, motivation, and the clarity of teaching steps.

Use a structured troubleshooting flow so your team doesn’t just try random things. The goal is steady improvement with the least intrusive, most teachable plan.


Implement

Treat technician training as part of the intervention, not a separate task. Use behavioral skills training until staff can show the skill, not just describe it. Then keep checking fidelity—small errors can erase progress in comprehensive programs.

If your caseload or schedule doesn’t allow enough observation and coaching, name that as a clinical risk and advocate for safer oversight levels. This paper points to common industry standards that call for meaningful BCBA time with the learner present. You can use that when talking with leadership and funders.

Evaluate

Widen what you measure. Keep your skill and behavior graphs, but add simple social validity checks: Do caregivers still want this goal? Does the learner show assent for how we teach? Is this showing up at home or school?

If a learner resists procedures, treat it as important data. Offer choices, adjust demands, and reconsider procedures that may be effective but not acceptable for that learner and family. This isn’t about avoiding all hard work—it’s about making sure goals and methods remain respectful and worth the cost.


Final thoughts

This model won’t replace clinical judgment, and it won’t guarantee outcomes. But it can help you work more consistently, notice gaps earlier, and explain your programming decisions in a way that protects the learner’s quality of life.

If you make one change after reading this: for every new program you write, add one sentence stating why this goal matters and how you’ll know it helped outside the therapy room.


Works Cited

LaMarca, V. J., & LaMarca, J. M. (2024). Using the ADDIE model of instructional design to create programming for comprehensive ABA treatment. Behavior Analysis in Practice, 17, 371–388. https://doi.org/10.1007/s40617-024-00908-2
