What Most People Get Wrong About Data Visualization & Analytics
You made a chart. You pulled the data, picked a graph type, and added some colors. But when you shared it at a parent meeting, you got blank stares. Or worse, someone made a decision based on what they thought the chart said—and it was wrong.
This happens more often than most ABA professionals want to admit. Data visualization and analytics mistakes cost clinics time, erode trust with families, and can lead to poor clinical choices. The good news? Most of these mistakes are easy to spot and fix once you know what to look for.
This guide is for BCBAs, clinic owners, supervisors, and anyone else who shares data with teams or families. You’ll learn the most common chart and analytics mistakes, understand why they matter, and get simple rules to fix them fast.
Start Here: Charts Are Decision Tools (Not Decorations)
A chart has one job: help someone decide what to do next.
It’s not a poster for your office wall. It’s not proof that you collected data. It’s a visual tool that should make the right call clearer and faster.
When a chart confuses people, they guess. They fill in blanks with assumptions. A parent might think progress is better than it is. A supervisor might miss a warning sign. A funder might question your credibility.
The risk of a confusing chart isn’t just wasted time—it’s wrong decisions.
Better charts aren’t fancier charts. Better means clearer and more honest. A simple bar chart that answers the question beats a colorful dashboard that raises more questions than it answers.
Quick Definitions in Plain Language
Before we go further, here are a few terms you’ll see throughout this guide.
- Data visualization: a picture of data, like a chart or graph.
- Analytics: using data to answer a question.
- Axis: a line with numbers on a chart, usually one on the side and one on the bottom.
- Baseline: what things looked like before you made a change—your comparison point.
If you want a simple review routine, save the checklist at the end and use it before you share any chart.
Ethics First: Privacy and Honest Progress Pictures
Before we talk about chart types or colors, we need to talk about ethics. Misleading visuals aren’t just a style problem—they’re an ethics problem because they can change care decisions.
When a chart makes progress look bigger than it is, families may expect outcomes that won’t happen. When a chart hides problems, teams may miss the chance to adjust. When you share data without thinking about who will see it, you may expose private information.
Privacy basics are straightforward. Only share what the audience needs. Protect client identity. Use role-based permissions when sharing dashboards. Avoid public links when possible, and if you must share externally, use password protection. For demos or training, use dummy datasets instead of real client data.
Honest progress pictures mean avoiding charts that exaggerate or minimize change. Truncated axes, cherry-picked timeframes, and 3D effects can all distort the truth. The goal is transparency—include the baseline, avoid arbitrary cutoffs, and design to inform rather than manipulate.
Human oversight is non-negotiable. Charts support judgment. They don’t replace it. Human review is required before anything enters the clinical record.
Before You Share: Ask These Two Questions
Every time you prepare to share a chart, pause and ask:
- Who will see this, and what do they need to decide?
- Does this chart protect privacy and tell the truth?
If you can’t answer both questions clearly, revise. If you lead a team, turn these two questions into a standard data share rule.
How to Read This Guide: The Bad vs Better Pattern
For each mistake, you’ll see what it looks like in real charts, why it misleads people, and a specific action you can take today to fix it.
You don’t need to fix everything at once. Pick one mistake you know you make. Fix that one first. Small changes, one chart at a time, add up to much clearer communication.
Building Your Own Mini Gallery
If you want to teach your team, consider creating side-by-side screenshots for each mistake. Add short callouts labeled “Problem” and “Fix.” Keep the same data in both versions so the improvement is obvious. Visual training helps more than written policies alone.
Mistake #1: Using the Wrong Chart Type
The most common data visualization mistake is choosing a chart that doesn’t match the question you’re trying to answer.
What this looks like: Pie charts with too many slices, fancy 3D charts that hide the point, or mixing totals and rates in one view. A pie chart with ten categories makes comparisons nearly impossible. A line chart for unrelated categories confuses the story.
Why it matters: People take the wrong message away. If your chart makes you explain too much, it’s probably the wrong type.
How to fix it: Choose a chart based on the question. Comparing categories? Use a bar chart. Showing change over time? Use a line chart. Many categories? A horizontal bar chart beats a pie chart almost every time.
Before you build a chart, write the question at the top. Then pick the chart that answers it.
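If your team builds charts in Python, here is a minimal sketch of the "many categories" fix: a horizontal bar chart instead of a pie chart, with the question written as the title. This assumes matplotlib is available; the skill areas and counts are made-up example data, not from any real client.

```python
# Sketch: horizontal bar chart for many categories (assumes matplotlib).
# Category names and counts are illustrative examples only.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display window needed
import matplotlib.pyplot as plt

categories = ["Manding", "Tacting", "Imitation", "Matching", "Listener responding"]
targets_mastered = [12, 9, 7, 5, 3]

fig, ax = plt.subplots()
ax.barh(categories, targets_mastered)  # horizontal bars compare more easily than pie slices
ax.invert_yaxis()                      # largest category on top
ax.set_xlabel("Targets mastered (count)")
ax.set_title("Which skill areas show the most mastered targets?")  # the question as the title
fig.savefig("skill_comparison.png")
```

With five or more categories, readers can compare bar lengths at a glance—something pie slices can't offer.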
Mistake #2: Clutter (Too Much in One Chart or Dashboard)
A cluttered chart is a confusing chart. Too many lines, too many colors, tiny text, or too many panels on one page—your audience stops reading or focuses on the wrong thing.
What this looks like: Dashboards crammed with every possible metric, charts with multiple overlapping data series, heavy gridlines, and decorations that add no meaning.
Why it matters: If your audience can’t find the main message in about ten seconds, the chart has failed.
The 10-second test: Show your dashboard to someone unfamiliar with it. Let them look for ten seconds. Hide the screen. Ask them what’s most important or whether things look good or bad. If they can’t answer, simplify.
How to fix it: Show one main story per chart. Split dashboards by audience. Remove extras like unnecessary gridlines, 3D effects, and loud backgrounds. Put the most critical information at the top left. Limit your dashboard to five to nine key metrics.
Mistake #3: Axis and Scale Problems (Charts That Mislead)
Axis choices can change the story your chart tells. A truncated axis can make a small change look dramatic. Uneven time ranges can make progress look faster or slower than it is.
What this looks like: A Y-axis that starts at 50 instead of zero, making a small improvement look huge. Uneven dates on the X-axis. Missing tick marks or unclear labels.
Why it matters: The chart can exaggerate change or hide it. This isn’t a style preference—it’s an ethics issue.
How to fix it: Label axes clearly with units and time range. Use consistent time windows. For bar charts, start the Y-axis at zero. If you zoom in for a valid reason, label it clearly and explain why.
If someone could accuse your chart of tricking them, pause and fix the scale.
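For teams charting in Python, the zero-baseline fix is one line. This sketch assumes matplotlib; the weekly counts are invented to show small changes that a truncated axis would exaggerate.

```python
# Sketch: enforce a zero baseline on a bar chart's Y-axis (assumes matplotlib).
# Weekly counts are made-up example data.
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

weeks = ["Wk 1", "Wk 2", "Wk 3", "Wk 4"]
counts = [54, 56, 55, 58]  # a truncated axis starting at 50 would make this look dramatic

fig, ax = plt.subplots()
ax.bar(weeks, counts)
ax.set_ylim(bottom=0)  # bars encode length, so the length must start at zero
ax.set_ylabel("Independent responses (count per week)")
ax.set_title("Independent responses by week")
fig.savefig("weekly_counts.png")
```

The same data plotted with a Y-axis starting at 50 would show bars that differ fourfold in height—an honest zero baseline shows the change at its true size.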
Mistake #4: Color Problems (Meaning, Contrast, and Accessibility)
Color choices affect whether your audience can understand your chart at all. This isn’t about making things pretty—it’s about making things readable.
What this looks like: Too many colors competing for attention, low contrast that makes lines hard to see, red and green used for meaning when many people can’t distinguish them, or color as the only cue to tell data series apart.
Why it matters: About eight percent of men have some form of color vision difference. If your chart fails for them, it fails.
How to fix it: Use fewer colors. Use high contrast by varying brightness, not just hue. Add patterns or different line styles in addition to color. Label lines or bars directly instead of relying only on a legend. Keep color meanings consistent across all your charts.
Quick test: Print or preview your chart in grayscale. If you can’t tell the data series apart, fix the colors and labels.
Mistake #5: Missing Context (Labels, Units, Baselines, and Definitions)
A chart without context forces your audience to guess. When people fill in blanks with guesses, they often get it wrong.
What this looks like: No title or a vague title, unclear units, missing baseline, no indication of phase changes, undefined terms.
Why it matters: A graph without units is like a recipe without measurements. You can’t act on information you don’t understand.
How to fix it: Add a clear title that describes what you’re measuring. Label both axes with units. Include a baseline phase. Mark phase changes with a vertical line. Add notes for meaningful events like medication or setting changes.
If your chart needs a long explanation, add context to the chart—not a longer meeting.
Mistake #6: Mixing Up Counts, Rates, and Percent (Denominator Problems)
When you report totals without showing “out of how many,” you can mislead your audience.
What this looks like: Reporting total behaviors without noting how many sessions were observed, comparing weeks with different session counts, or using percent without showing the base number.
Why it matters: Decisions change when the base changes. Ten tantrums in a week with twenty sessions is very different from ten tantrums in a week with five sessions. The total is the same, but the rate per session is four times higher in the second week.
How to fix it: Show the denominator. Use rates when time or opportunity changes. Add a note explaining the base. Keep the measure consistent across charts.
When someone asks whether something is good or bad, make sure your chart answers “out of how many” first.
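The tantrum example above can be sketched as a tiny calculation—useful if your team ever computes rates in a spreadsheet formula or script. The function name and numbers here are illustrative, not from any particular tool.

```python
# Sketch: the same weekly total means very different rates once you divide
# by the number of sessions observed. Numbers match the tantrum example above.
def rate_per_session(total_events: int, sessions: int) -> float:
    """Return events per session; the denominator makes totals comparable."""
    if sessions <= 0:
        raise ValueError("sessions must be positive")
    return total_events / sessions

week_a = rate_per_session(10, 20)  # 10 tantrums across 20 sessions -> 0.5 per session
week_b = rate_per_session(10, 5)   # 10 tantrums across 5 sessions  -> 2.0 per session

print(f"Week A: {week_a:.1f}/session; Week B: {week_b:.1f}/session")
print(f"Week B's rate is {week_b / week_a:.0f}x higher despite identical totals")
```

Same total, four times the rate—which is why a chart of raw counts without the session denominator can point a team at the wrong week.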
Mistake #7: Seeing a Trend That Isn’t There (Noise vs Signal)
Data naturally goes up and down. Not every high day is a breakthrough, and not every low day is a crisis. Overreacting to normal variation burns out staff and families.
What this looks like: Big decisions based on one high or low day, constant plan changes, calling something a trend after two data points.
Why it matters: You can chase the wrong problem. You might drop an intervention that was working or continue one that isn’t. The team loses trust in the data process.
How to fix it: Look for patterns over time, not single points. Use consistent time windows. Add markers for phase changes.
A simple team script: “Let’s look at the last few weeks, not just one day. What changed in the environment? Do we see this pattern again?”
Build a habit: no big decision from one data point unless safety requires it.
Mistake #8: Confusing “Related” With “Caused By” (Correlation vs Causation)
When two things change at the same time, it’s tempting to say one caused the other. But charts showing two things moving together don’t prove causation.
What this looks like: “This went up when we started the new intervention, so the intervention caused it”—without checking what else changed.
Why it matters: You may keep the wrong intervention or drop one that was working. You may miss the real cause.
How to fix it: List other changes happening at the same time. Use careful language like “these changes happened around the same time” or “we need more data to be sure.”
If you can’t explain the why, don’t label the chart as proof. Label it as a clue.
Mistake #9: Building for the Wrong Audience (Parents, Techs, Leaders)
A good chart fits the reader. A dashboard designed for a BCBA won’t work for a parent. A summary for leadership won’t help a technician know what to do today.
What this looks like: One dashboard for everyone, unclear next steps, too much jargon.
Why it matters: The right people don’t act, or they act the wrong way. A parent who doesn’t understand a chart may lose trust. A technician who can’t find key information may miss important changes.
How to fix it: Match the chart to the decision and the reader. Use plain labels. Include “what to do next” notes.
- Parents need a clear progress story and next steps—simple graphs, simple terminology.
- Frontline staff need to know what to do today and what counts as success.
- Leaders need system health information—capacity, staffing, authorization status.
Make one parent view and one team view. Don’t force one chart to do both jobs.
Pre-Share Checklist: 10 Things to Check Before You Send a Chart
Before you share any chart or dashboard, run through this checklist. It takes less than a minute.
- Purpose: What decision should this chart support?
- Chart type: Does it fit the question?
- Title: Does it explain what you’re looking at?
- Axes: Are they labeled with units and time range?
- Scale: Is it clear and not misleading?
- Baseline: Are the baseline and key changes marked?
- Color: Is it consistent, accessible, and not the only cue?
- Clutter: Does it show only what the audience needs?
- Context: Are definitions and denominators included?
- Takeaway: Is there a one-sentence summary and next step?
If you work with a team, have one person create the chart and a second person do a 30-second confusion test before sharing.
Frequently Asked Questions
What is the most common data visualization mistake?
Choosing a chart that doesn’t match the question. For example, using a pie chart for ten categories makes comparisons nearly impossible. Write the question first, then pick the chart type that answers it.
How do I choose the right chart type?
Start with the question. Comparing categories? Bar chart. Change over time? Line chart. Parts of a whole with two or three categories? Pie chart can work. Choose the simplest chart that answers your question.
Why are truncated axes considered misleading?
Changing where an axis starts changes how dramatic a change looks. A small improvement can look huge if you start the Y-axis at 90 instead of 0. Zooming in is okay if you clearly label it and explain why.
How can I make charts easier for parents?
Use plain titles and labels. Add units and define unfamiliar terms. Include a one-sentence takeaway and clear next step. If a parent can’t understand the chart without you in the room, revise.
What are common color mistakes in charts?
Using too many colors, low contrast, and relying on color alone to distinguish data series. Design so your chart works in grayscale by varying brightness, adding labels, and using patterns or line styles.
What is a common analytics mistake when interpreting charts?
Overreacting to one data point. Normal variation gets treated as a trend, leading to unnecessary plan changes. Before deciding, check the time window, the denominator, and what else changed.
How do I know if my dashboard has too much information?
Use the 10-second test. Show it to someone unfamiliar for ten seconds, then ask what’s most important. If they can’t answer, simplify.
Conclusion
Better data visualization isn’t about making prettier charts. It’s about making clearer decisions.
The mistakes in this guide are common because they’re easy to make. Wrong chart types, cluttered dashboards, misleading axes, color problems, missing context, denominator confusion, overreacting to noise, confusing correlation with causation, building for the wrong audience—these show up in clinics every day. But they’re also easy to fix once you know what to look for.
Start small. Pick one chart you shared recently. Run the checklist. Make two fixes. Notice how much clearer the communication becomes.
Charts support judgment. They don’t replace it. When you make that principle visible in every chart you share, you build trust with families, teams, and funders. You make better clinical decisions. And you protect the people who depend on you to get the data right.