Why Most RBT Training Fails: The Missing Instructional Design Component
Picture this: An RBT® completes their 40-hour training, passes the exam, and shows up for their first day ready to work with clients. But within hours, it becomes clear that something's missing. The terminology is there, but the genuine preparedness isn't. The supervisor realizes they're essentially starting from scratch.
If this scenario sounds familiar, you're not alone. According to Thomas Freeman, Senior Vice President at ABA Technologies and a behavior analyst with over four decades of experience, the problem isn't that RBTs need more training hours; it's that many training programs lack a critical component: sound instructional design.
I recently sat down with Tom to discuss why so many RBT training programs fall short and what we can do about it. Tom has extensive experience developing coursework on behavior analytic approaches to instructional design, and his insights reveal an uncomfortable irony: We're behavior analysts teaching future behavior analysts, yet many of our training programs fail to utilize the very teaching procedures our science has empirically demonstrated as most effective.
What We've Known Since the 1970s
Tom's background in instructional design traces back to work he did with our founder, José Martinez-Diaz, and Dr. Kristin Myers-Kemp, developing coursework specifically focused on behavior analytic approaches to teaching. "Since the beginning of providing our coursework, we have covered various effective teaching methods developed using the behavior analytic approach to instructional design," Tom explained. This wasn't just theoretical work; it was grounded in some of the most compelling educational research ever conducted.
He pointed me toward Project Follow Through, which began in 1968 and stands as the most extensive educational experiment ever conducted. The study included over 200,000 children across 178 communities and compared 22 different instructional models. The findings were clear: Direct Instruction and behavior analysis emerged as the most effective methods for teaching language and math, and Direct Instruction is itself grounded in the science of behavior analysis.
"As early as the 70s, Direct Instruction has been identified as one of the most effective instructional methods," Tom noted. The evidence has been clear for decades. In addition, Skinner’s Programmed Instruction and Keller’s Personalized System of Instruction have produced equally impressive learning outcomes. So why are so many failing to apply these proven principles to our own field's training? Tom identified the limitations of certain learning platforms as one potential issue. But there are more.
Three Critical Design Flaws
When I asked Tom about the most common instructional design flaws he sees in typical RBT training programs, he didn't hesitate. He identified three major issues that plague most programs, particularly those delivered online.
Flaw #1: Passive Learning Instead of Active Engagement
The first and perhaps most significant problem is the lack of interaction between teachers and students. "This can be especially true of online training, but we even see this problem in some live on-site classroom instruction," Tom noted. Many programs essentially deliver information in one direction without meaningful engagement.
The solution? Active Student Responses (ASRs) throughout the presentation. "We should be constantly pausing sessions to evaluate how students are getting the material," Tom explained. This kind of real-time feedback on students' mastery of the material is often missing from online training programs.
Flaw #2: Assessment Only at the Finish Line
This leads us to the second major flaw: assessments that only appear at the end of training. Think about the logic here. If you wait until the end of a 40-hour training to discover that a learner didn't understand material from hour three, you've missed your window to address it effectively.
José’s approach to instructional design, operationalized by Tom and Kristin, later with help from Adam Hockman and a whole team of skilled subject matter experts (SMEs) and instructional designers, involves constant checkpoints that gather data on students’ ongoing performance as they progress through the material. By the time the final assessment is presented at the end of the course, there are no surprises.
Flaw #3: The Misalignment Problem
The third flaw Tom identified was particularly frustrating: testing on things that aren't taught. This might sound absurd, but it happens more often than you'd think when learning objectives aren't clearly defined and instruction isn't directly aligned with those objectives. Tom told a story of an instructor who gave a presentation on the importance of teaching by objectives, yet the presentation itself didn't contain a single learning objective. “Old habits die hard. And honestly, we all need the help and feedback from others to keep us on the right track when it comes to best practice, not only in teaching but in any intervention designed to change human behavior.”
Tom also noted the need for more opportunities for deeper learning, meaningful discussions, and study questions to enhance learning from readings and videos. José and the team made sure students were always provided these extra opportunities for discussion and interaction, not only with course instructors but with each other. When instructional design is done well, every component works together with purpose.
The Real-World Ripple Effect
So what happens when RBTs complete training programs that lack these fundamental instructional design principles? Tom's answer was sobering and illuminating.
"RBT training is the didactic piece that becomes meaningless without effective supervision," he explained. But before you misunderstand, he's not saying training doesn't matter. He's saying that “poorly designed training can fail to set a firm foundation upon which a skilled supervisor can build the real-world skills required by an effective and ethical RBT who will seek continual growth and improvement.” That's the goal here. Tom used a perfect analogy: "It's like watching a video on baseball and thinking you can now just go out and hit a curveball." RBT training introduces behavior analysis terminology, but without sound instructional design principles, "it does not adequately prepare individuals for client work without practical application."
He shared an anecdote about someone he met who had completed online RBT training without any specific memory of what he had seen. The training had allowed him to attend to everything around him except the training! It simply wasn't designed to ensure learning actually occurred.
This creates a cascade of problems. For RBTs, it means entering the field feeling underprepared and overwhelmed. For supervisors, it dramatically increases their burden; they are unsure of the gaps that exist or what their supervisees have actually learned. Ultimately, for clients and their families, inadequate preparation affects the quality of care.
When I asked Tom about the upcoming 2026 requirement for Behavior Skills Training (BST) in RBT programs, he was cautiously optimistic. "It depends on how it's done," he noted. BST involves instruction, modeling, rehearsal, and feedback, and it can be effectively done online if structured properly. But the key is thoughtful implementation. Simply checking a box won't solve the underlying instructional design problems.
What Quality Instructional Design Actually Looks Like
So what should quality RBT training look like? Tom outlined several core principles that emerged from decades of research and his own experience developing instructional content:
Active Student Responses throughout the training: Constant engagement, not passive consumption. Learners should be responding, practicing, and demonstrating mastery at every stage. Ideally, learners are also encouraged to complete exercises outside of class to build fluency.
Ongoing assessment: Checking understanding in real time throughout the training, not just at the end. This allows instructors to identify and address gaps in the student’s skill set immediately.
Clear objectives with aligned teaching: Every assessment item should map directly to what was explicitly taught. Under José’s mentorship, Tom, Kristin, and the curriculum team modeled their approach to assessments in part on the BACB’s use of item analyses, revising or removing questions that more than 50% of learners got wrong to keep assessments fair and aligned (see the sketch after this list).
Iterative improvement: Instructional design isn't static. It must evolve based on feedback from learners. What works? What doesn't? The design should continuously improve based on data.
Additional essential elements: Guided notes, structured repetition, meaningful teacher-student interaction—all working together to create a comprehensive learning experience.
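To make the item-analysis step concrete, here is a minimal sketch of how a training provider might flag questions for revision. The learner data, function name, and record format are hypothetical illustrations; only the more-than-50%-wrong rule comes from the approach described above.

```python
from collections import defaultdict

# Hypothetical response records: (learner_id, item_id, correct).
# In practice these would come from a training platform's data export.
responses = [
    ("learner_01", "item_A", True),
    ("learner_01", "item_B", False),
    ("learner_02", "item_A", True),
    ("learner_02", "item_B", False),
    ("learner_03", "item_A", False),
    ("learner_03", "item_B", False),
]

ERROR_THRESHOLD = 0.5  # flag items that more than 50% of learners got wrong


def flag_items_for_review(records, threshold=ERROR_THRESHOLD):
    """Return {item_id: error_rate} for items whose error rate exceeds the threshold."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for _, item_id, correct in records:
        totals[item_id] += 1
        if not correct:
            errors[item_id] += 1
    return {
        item_id: errors[item_id] / totals[item_id]
        for item_id in totals
        if errors[item_id] / totals[item_id] > threshold
    }


for item_id, error_rate in flag_items_for_review(responses).items():
    print(f"{item_id}: {error_rate:.0%} of learners answered incorrectly; revise or remove")
```

When most learners miss a question, the question itself is the first suspect, either because it's poorly written or because the instruction never actually taught it.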
Taking Action: Where to Start
When I asked Tom what training providers could implement immediately to improve their programs, his advice was practical and actionable.
First, ensure learners are actually attending to the training. Build in mechanisms that require engagement, not just exposure. If someone can "complete" your training while doing laundry and checking their phone, your instructional design needs work.
Second, equip supervisors with information about what their supervisees actually learned. Tom emphasized providing supervisors with detailed syllabi showing what was taught, clear learning objectives, and ideally, information about how their supervisees performed. At ABA Technologies, we've taken this a step further; supervisors can access their RBTs' training directly, giving them complete transparency into exactly what material was covered and how their supervisee engaged with it.
Third, implement ongoing assessment throughout the training. Move beyond end-of-course exams to frequent check-ins that catch misunderstandings before they compound.
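As one possible illustration of such check-ins, the sketch below records checkpoint scores per module and immediately flags any module where the learner's latest score falls below a mastery criterion. The module names and the 80% criterion are assumptions for illustration, not details from Tom's programs.

```python
# A minimal sketch of ongoing assessment: record checkpoint scores per module
# and surface gaps right away, rather than waiting for a final exam.
MASTERY_CRITERION = 0.80  # illustrative assumption, not a BACB requirement

checkpoint_scores: dict[str, list[float]] = {}


def record_checkpoint(module: str, score: float) -> None:
    """Store a checkpoint score (0.0-1.0) for a module as the learner progresses."""
    checkpoint_scores.setdefault(module, []).append(score)


def modules_needing_review() -> list[str]:
    """Return modules whose most recent checkpoint fell below the mastery criterion."""
    return [
        module
        for module, scores in checkpoint_scores.items()
        if scores[-1] < MASTERY_CRITERION
    ]


record_checkpoint("Measurement", 0.90)
record_checkpoint("Skill Acquisition", 0.60)  # a gap caught in hour three, not hour forty

for module in modules_needing_review():
    print(f"Revisit '{module}' before moving on.")
```

The design point is simply that gaps surface as soon as they appear, while there is still time to reteach.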
The bigger picture involves using item analyses to identify and strengthen flawed questions, gathering and systematically responding to learner feedback, and recognizing that instructional design should evolve based on data, the hallmark of the behavior analytic evidence-based approach.
The Foundation Matters
Here's what I took away from my conversation with Tom: The problem isn't that training alone is insufficient; it's that poorly designed training leaves RBTs unprepared to benefit from supervision. Quality instructional design doesn't replace supervision; it increases the likelihood that supervision will be both more efficient and effective.
We have decades of evidence about what works in instruction. Project Follow Through gave us many useful answers back in the 1970s. Behavior analysts and instructional designers have refined these principles ever since. It's time for our field's training to reflect the best practices of our own science.
When RBT training incorporates sound instructional design principles (active engagement, ongoing assessment, clear objectives, and aligned instruction), it creates RBTs who enter supervision ready to learn and grow. They have a genuine foundation, not just familiarity with terminology.
Whether you're a training provider or a supervisor, you have the power to demand and create better-designed training. Because at the end of the day, this isn't just about checking boxes or meeting requirements. It's about preparing the people who will work directly with our clients to actually succeed in that role.
That's a mission worth the extra effort of doing instructional design right.