Using AI-Powered Video Feedback to Improve Workplace Posture

Neck pain and musculoskeletal problems are common among desk workers, often linked to prolonged forward head posture. While clinicians know that feedback can change behavior, delivering consistent, scalable posture feedback in real work settings remains a challenge. This study explores whether AI-generated video feedback offers a practical solution—and what clinicians should consider before adopting it.

What is the research question being asked and why does it matter?

This study asked a straightforward question: if you show someone an AI-generated video review of their posture, will they hold their neck in a safer position during desk work?

This matters because many people who work at desks spend significant time with their neck bent forward. Over time, this can contribute to neck pain and other musculoskeletal problems.

For clinicians, the bigger issue is how to deliver posture feedback in a way that’s fast and repeatable. In real settings, having a trained observer watch and score posture all day isn’t practical. Sustaining feedback long enough for changes to stick is equally difficult. This study examined whether an AI app can help solve that delivery problem.

The researchers also asked a second practical question: can the AI app measure body angles accurately enough to be useful? If the measurements are far off, any feedback based on them could be misleading. So they compared the app’s angle scores to a sensor-based system often treated as a gold standard.

What did the researchers do to answer that question?

Ten college students performed a desk-like task in a small room set up to resemble an office. They built plastic block models while sitting at a table and looking at picture instructions placed on the table surface. Each trial lasted 2 minutes because the app only recorded videos of that length.

The main outcome was the percentage of each 2-minute trial where the person’s neck was in a “low-risk” angle versus “medium-risk” or “high-risk.” The researchers chose the neck as the target because, during baseline, it was the body part that most often showed medium-risk posture across all participants.
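To make the outcome measure concrete, here is a minimal sketch of how a "percent of trial per risk level" score could be computed from frame-by-frame posture labels. This is a hypothetical illustration; the paper does not describe the app's internal output format, so the frame labels and sampling rate below are assumptions.

```python
# Hypothetical sketch of the study's main outcome measure: the share of a
# trial spent in each neck-posture risk category, given frame-level labels.
from collections import Counter

def risk_percentages(frame_labels):
    """Return the percentage of frames in each risk category."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {level: 100 * counts.get(level, 0) / total
            for level in ("low", "medium", "high")}

# Example: a 2-minute trial sampled once per second (120 frames, assumed rate)
labels = ["low"] * 90 + ["medium"] * 24 + ["high"] * 6
print(risk_percentages(labels))  # {'low': 75.0, 'medium': 20.0, 'high': 5.0}
```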

They used a multiple baseline design across participants with three main phases. In baseline, participants simply did the task with no coaching. In the information phase, the researcher read a short set of general “good sitting posture” rules once before the first trial. In the video feedback phase, participants watched a 1-minute clip from their previous trial before each new trial, with an AI overlay using colors to label neck posture risk (green, yellow, red).

After posture improved, the researchers tried “thinning” feedback so it wasn’t needed every time. Some participants received feedback every 3rd trial, others every 5th, based on how steady their low-risk neck posture remained.

They also ran two validation tests comparing the AI app’s joint angle scores to a 17-sensor motion capture system—once with still poses and once during movement.
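One common way to summarize that kind of agreement is a mean absolute error between paired readings. The sketch below is illustrative only; the angle values are made up, and the study's actual validation statistics are not reproduced here.

```python
# Hypothetical sketch of summarizing agreement between an AI app's joint
# angles and a sensor-based reference: mean absolute error per joint.

def mean_absolute_error(app_angles, sensor_angles):
    """Average absolute difference (degrees) between paired angle readings."""
    pairs = list(zip(app_angles, sensor_angles))
    return sum(abs(a - s) for a, s in pairs) / len(pairs)

app = [42.0, 45.5, 39.0, 41.0]      # app-estimated neck angles (made-up values)
sensor = [40.0, 46.0, 37.5, 42.0]   # reference sensor readings (made-up values)
print(mean_absolute_error(app, sensor))  # 1.25
```

A small average error still allows larger disagreement for individual readings, which is one reason (discussed below) not to treat any single AI score as the whole truth.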

How you can use this in your day-to-day clinical practice

If you’re working to improve staff ergonomics—or even caregiver posture during table work—this study supports a basic approach: information alone usually isn’t enough, but information plus video feedback can help.

In this study, reading posture rules led to small, short-lived improvement for most participants and no improvement for one. The bigger, steadier change came when people saw themselves on video with clear risk-level cues. If your current approach relies mostly on training or reminders, consider adding short video-based feedback as the main active ingredient.

Use these results as guidance for setting up feedback, not as proof that an AI app will work in every workplace. These were college students doing a simulated task for research credit, not employees working under real deadlines. Still, the workflow resembles something you could do in supervision: record a short sample, review it immediately, then practice again. The key is quick turnaround so the person can connect what they see to what they do next.

Keep the target behavior tight and observable. The researchers didn’t try to fix everything at once. They picked one body part—the neck—because it was the clearest problem in baseline. You can do the same: identify the one posture that’s most risky or most linked to discomfort for that person and task. This prevents ergonomics from becoming a long list of rules no one can follow.

Deliver feedback in a way that supports choice and dignity. Video feedback can feel personal and raise privacy concerns. If you use video at work, get clear consent, and explain who will see the footage and how long it will be stored. The study authors warned that ergonomic data could be misused to punish workers. Set the expectation that the purpose is support and problem-solving, not discipline.

Plan for fading, because constant feedback may not be realistic. In this study, many participants maintained good neck posture even when feedback dropped to every 3rd or 5th trial—but not everyone did. Two participants showed declines when feedback was thinned and needed more frequent feedback again. Build a fading plan, but watch the data and be ready to increase feedback frequency if performance drops. Don’t assume one fading schedule fits all staff or tasks.

Use booster feedback as a normal part of maintenance. The data suggest some people will need periodic check-ins to keep posture safe, especially when tasks change or fatigue sets in. Schedule brief booster sessions—a short video review every few days at first, then weekly, then monthly—and adjust based on performance. Removing support can lead to drift, so fading should still leave some ongoing contact with feedback.

Be careful about over-trusting the AI numbers. The app’s angles were fairly close to the sensor system on average, but differences were larger for some body parts and conditions. Camera placement also affects what the app “sees.” Treat the AI output as one data source, not the whole truth. If the app flags someone as “high risk” but the video looks fine, troubleshoot the setup instead of pushing the person to change based only on the score.

Don’t ignore the environment just because you have feedback. This study didn’t test chair height, monitor height, or material placement, but the task setup likely pushed people to look down at the table frequently. In real work, you may get bigger and more comfortable changes by moving materials up (like a document holder), adjusting screen height, or changing keyboard and mouse position. Use video feedback to help the person notice and practice, while also fixing simple setup problems that make good posture difficult.

Finally, remember what this study does and doesn’t apply to. It focused only on neck angle during seated desk-like work, with short sessions, and with people who likely didn’t have major injuries. It doesn’t tell you what to do for workers with pain conditions, physical limitations, or jobs involving different movements. Apply the general lesson—fast, clear video feedback can improve a specific posture behavior—and then tailor the plan to the person, the task, and the setting.


Works Cited

Espericueta Luna, A., Wu, Y. J., Luo, Y., Hu, B., & Gravina, N. (2025). Using AI-powered video feedback to improve ergonomics: An analog experiment. Journal of Organizational Behavior Management. https://doi.org/10.1080/01608061.2025.2482157