
Research reviewed: Participatory approach to selecting technologies for instruction delivery at an electric heater manufacturing plant (Goomas & Ludwig, 2025)

Participatory Technology Selection in Manufacturing: Lessons for OBM Practice

When a workplace adopts new technology, the real question isn’t whether the tool is better on paper—it’s whether workers will actually use it. This study examines what happens when employees help choose the technology they’ll rely on every day. For BCBAs and OBM clinicians, the findings offer practical guidance on making technology decisions that stick.


What Is the Research Question, and Why Does It Matter?

The central question: when a factory wants to replace paper work instructions with a tech tool, should workers help pick it—and what happens when they do?

This matters because many workplaces buy technology that looks superior on paper, but workers don’t like using it. When that happens, usage drops, errors rise, frustration builds, and some workers may quit. For OBM clinicians, the key point is that “best” technology isn’t just about features. It’s about what workers will actually use, every day, across a full shift.

The plant in this report had a real problem with paper binders. Pages were torn, missing, and hard to keep updated. The operations manager spent time writing, printing, walking, and inserting instructions across nine stations—then doing it again whenever something changed. Delays happened when pages went missing, and outdated instructions stayed in binders long after products were discontinued.

The practical problem wasn’t just “paper vs. tablet.” It was wasted time, instruction errors, and slow updates affecting quality and schedule.


What Did the Researchers Do?

The team worked with one electric heater manufacturing plant, including all 21 assembly-line workers across two shifts. They used a participatory approach: workers tried both options and gave input that management agreed would inform the final decision.

The two options were ruggedized tablets at workstations and augmented reality (AR) glasses with a built-in camera that pulled up the right instructions when the worker scanned a barcode. Each worker got a short hands-on demo of each option, with the demo order switched across shifts to reduce "first-option bias."

After the demos, workers completed a brief survey covering openness to new tech, training quality, device preference, whether they’d return to binders, whether the new system made the job easier, and whether they felt included in the decision. The OBM practitioner also spoke with workers one-on-one during breaks to gather informal feedback without peer influence.

Results were summarized as simple counts—no deeper statistical analysis and no direct measures of productivity, quality, or safety.


How to Use This in Day-to-Day Clinical Practice

Treat User Acceptance as a Real Outcome

When your site is choosing between technology options, treat user acceptance as a real outcome—not a side issue.

In this report, workers preferred tablets over AR glasses mainly because they didn’t want to wear a device on their face for an entire shift. That kind of barrier is easy for management to miss, but it directly predicts whether a tool will be used with fidelity.


In practice, build in a structured “try it during real work” step before purchase. Make comfort, effort, and workflow fit part of the decision criteria.

Use a Simple, Quick, Ethical Participatory Process

Give staff a short demo of each option. Let them touch it, wear it, scan with it, and see the exact steps they’d do on the job. Then collect private feedback right away—with anonymity when possible—so workers can answer honestly without fear of looking “anti-change.”

The survey in this report was short and direct. It asked the key questions that predict adoption: preference, training adequacy, and whether the tool makes the job easier. You can copy this structure, but consider adding at least one question about safety and one about physical comfort to protect dignity and reduce injury risk.
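
If it helps to picture that, here is a minimal sketch (in Python) of what a post-demo survey and its anonymous tally could look like. The item wording, the two added comfort and safety questions, and the tally helper are illustrative assumptions, not the instrument used in the report.

```python
from collections import Counter

# Post-demo survey items modeled on the report's structure (preference,
# training adequacy, ease of job, return to binders, felt included),
# plus two suggested additions on safety and physical comfort.
# Wording here is illustrative, not the published instrument.
SURVEY_ITEMS = [
    "Which option do you prefer: tablet or AR glasses?",
    "Was the training you received adequate? (yes/no)",
    "Would the new system make your job easier? (yes/no)",
    "Would you want to return to paper binders? (yes/no)",
    "Did you feel included in this decision? (yes/no)",
    "Do you have any safety concerns with either option? (yes/no)",               # added item
    "Could you comfortably use your preferred option for a full shift? (yes/no)",  # added item
]

def tally_responses(responses):
    """Summarize anonymous responses as simple counts per item.

    `responses` is a list of dicts mapping each item to an answer,
    one dict per worker, with no identifying information attached.
    """
    return {item: Counter(r.get(item) for r in responses) for item in SURVEY_ITEMS}

# Example with two hypothetical anonymous responses:
example = [
    {SURVEY_ITEMS[0]: "tablet", SURVEY_ITEMS[1]: "yes"},
    {SURVEY_ITEMS[0]: "tablet", SURVEY_ITEMS[1]: "yes"},
]
print(tally_responses(example)[SURVEY_ITEMS[0]])  # Counter({'tablet': 2})
```

Keeping the tally to simple counts mirrors how the report summarized its results, and it keeps the feedback easy to share with workers afterward.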

Plan for the “All Day, Every Day” Test

The glasses option sounded hands-free and high-tech. Management expected workers to like it. They didn’t—because wearing it for eight hours felt burdensome.

In your work, ask staff to imagine the full shift and the worst-case moments: heat, sweat, PPE conflicts, fogging, headaches, battery issues, cleaning needs, and whether the tool gets in the way during tight or risky tasks. If the tool adds effort during stressful moments, usage will drop even if the tool is “better” in theory.

Include Cost Plus Behavior in Selection Criteria

The report gave clear cost estimates: tablets cost less per unit and, as fixed station equipment, didn't need to be handed off between workers across shifts the way a face-worn device would.

As a clinician, you can help leaders avoid false savings by mapping likely behavior outcomes: How often will staff choose to use it? How often will they bypass it? What errors happen when they do?

A cheaper tool that gets used consistently can beat a more expensive tool that sits unused or creates workarounds. Your role is to bring behavior-based prediction into the business decision.
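
One way to bring that prediction into the conversation is a quick expected-use calculation before purchase. Every number in the sketch below is a hypothetical placeholder meant to show the arithmetic; none of the figures come from the report.

```python
# Hypothetical comparison: a cheaper tool that workers use consistently
# versus a pricier tool that gets bypassed. All numbers are placeholders.

def expected_weekly_instruction_errors(jobs_per_week, bypass_rate, error_rate_when_bypassed):
    """Errors we expect when workers skip the tool and work from memory or notes."""
    return jobs_per_week * bypass_rate * error_rate_when_bypassed

jobs_per_week = 400  # placeholder volume

# Option A: workstation tablets -- assumed lower cost, high predicted use.
tablet_cost = 9 * 800            # 9 stations x $800 per rugged tablet (placeholder price)
tablet_errors = expected_weekly_instruction_errors(jobs_per_week, bypass_rate=0.05,
                                                   error_rate_when_bypassed=0.10)

# Option B: AR glasses -- assumed higher cost, lower predicted use because
# workers reported not wanting to wear a device for a full shift.
glasses_cost = 21 * 1500         # 21 workers x $1,500 per headset (placeholder price)
glasses_errors = expected_weekly_instruction_errors(jobs_per_week, bypass_rate=0.30,
                                                    error_rate_when_bypassed=0.10)

print(f"Tablets: ${tablet_cost:,} up front, ~{tablet_errors:.0f} instruction errors/week")
print(f"Glasses: ${glasses_cost:,} up front, ~{glasses_errors:.0f} instruction errors/week")
```

The point isn't the specific numbers; it's that adoption rate belongs in the same spreadsheet as purchase price.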

Don’t Assume Preference Equals Performance

This report didn’t measure error rates, build time, rework, or injuries before and after. It mainly measured preference and satisfaction, plus some time savings for the manager updating instructions.

Don’t promise that participatory selection will automatically improve production outcomes. What you can do is use this report to support a two-step plan: first, pick a tool workers will accept; second, evaluate the tool’s impact on objective outcomes once it’s in place.

Build Measurement into Implementation

After selection, set up a simple system to track outcomes that matter at your site:

  • Number of instruction-related defects
  • Time to find the right instruction
  • Number of stops due to missing information
  • How often instructions need updates

Also measure use itself—how often the device is used per job, or whether workers switch back to memory or paper notes.
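
The tracking system doesn't need to be fancy. Here is a minimal per-job log sketch; the field names and the example entry are assumptions for illustration, not anything described in the report.

```python
import csv
import os
from datetime import date

# One row per job: was the instruction actually opened, how long did it take
# to find, and did anything go wrong? Field names are illustrative
# assumptions, not part of the published report.
FIELDS = [
    "date", "station", "shift", "job_id",
    "instruction_opened",      # yes/no: was the device actually used for this job?
    "seconds_to_find",         # time to locate the right instruction
    "stopped_missing_info",    # yes/no: work stopped because information was missing
    "instruction_defect",      # yes/no: a defect traced back to the instruction
]

def log_job(path, **row):
    """Append one job record to a plain CSV log, writing a header on first use."""
    write_header = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow(row)

# Example entry (hypothetical job):
log_job("instruction_log.csv",
        date=date.today().isoformat(), station=3, shift=1, job_id="H-1042",
        instruction_opened="yes", seconds_to_find=12,
        stopped_missing_info="no", instruction_defect="no")
```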

If possible, use an applied design realistic for workplaces, like a multiple baseline across stations or shifts, so no one loses a helpful tool just to “prove” the effect.
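
If you go the multiple baseline route, the core move is a staggered start: every station keeps collecting baseline data, and the tool comes online at one station (or shift) at a time. The sketch below shows that staggering with made-up station names and week numbers.

```python
# Staggered rollout for a multiple baseline across stations.
# Station names and week numbers are hypothetical placeholders.
ROLLOUT_WEEK = {"Station A": 3, "Station B": 5, "Station C": 7}

def phase(station, week):
    """Return which phase a station is in during a given week."""
    return "intervention" if week >= ROLLOUT_WEEK[station] else "baseline"

# Print the design as a simple grid: rows = stations, columns = weeks 1-8.
for station in ROLLOUT_WEEK:
    phases = [phase(station, wk)[0].upper() for wk in range(1, 9)]  # B = baseline, I = intervention
    print(f"{station}: {' '.join(phases)}")
# Station A: B B I I I I I I
# Station B: B B B B I I I I
# Station C: B B B B B B I I
```

Because later stations are still in baseline when earlier ones switch, no station has to give the tool up, and you still get a credible picture of whether the change tracks the rollout.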

Support Training with Performance Checks

All workers in the report said they had adequate training, but the training was brief.

In your practice, train to fluency on the exact behaviors needed: locating the correct job, confirming station-specific steps, and checking setup notes before action. Then do short competency checks in the work area—not just in a classroom—and provide quick prompts that fade. This keeps the technology from becoming another source of errors.

Use Participation to Protect Dignity

The strongest practical takeaway: workers felt included and didn’t want to go back to binders. That’s a meaningful quality-of-work signal, even if it’s not a productivity metric.

When you run participatory selection, be clear that the goal is a system that supports people to do skilled work with less hassle and fewer mistakes. Keep the message away from “we need you to comply with the new tech” and closer to “we’re choosing a tool that fits your job and keeps instructions accurate.”


Works Cited

Goomas, D. T., & Ludwig, T. D. (2025). Participatory approach to selecting technologies for instruction delivery at an electric heater manufacturing plant. *Journal of Organizational Behavior Management, 45*(4), 324–332. https://doi.org/10.1080/01608061.2024.2443135
