Where is the Analysis?

Friday, March 27, 2020

A recent blog post by Kristin Smith, M.Ed., BCBA, LBA, evaluated three levels of data: micro, meta, and macro. And while we collect microdata daily as clinicians, let us not ignore the importance of analyzing these data! Sure, the exciting analyses live in the meta and macro data – but we still need to show love to the microdata. It needs to be tended to, not ignored. And when clinicians face the push-pull demands of “the job,” it is easy to neglect the analysis of our microdata.

To ensure you are effectively analyzing the microdata, see if you can check off these four rules:

1) You have a protocol on how often the supervisors (e.g., Board Certified Behavior Analyst, consultants) evaluate the micro/programming data.

Generally speaking, the microdata should be evaluated as often as the BCBA is supervising the case (e.g., weekly). If the frequency is less than weekly, the next best thing is simply having a protocol and sticking to it. The last thing you want is for the microdata to go overlooked such that the BCBA is not evaluating trends and outcomes. This ties directly into rule #2.
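As a rough illustration, a review protocol like this can be checked automatically. The sketch below is hypothetical – the program names, the log structure, and the weekly interval are assumptions for illustration, not features of any particular practice-management system:

```python
from datetime import date, timedelta

# Assumed protocol: supervisors review each program's microdata weekly.
REVIEW_INTERVAL = timedelta(days=7)

def overdue_reviews(last_reviewed: dict, today: date) -> list:
    """Return programs whose microdata review is past the protocol interval."""
    return sorted(
        program for program, reviewed_on in last_reviewed.items()
        if today - reviewed_on > REVIEW_INTERVAL
    )

# Hypothetical review log: program name -> date of last supervisor review.
log = {
    "tacting": date(2020, 3, 20),
    "see-say sounds": date(2020, 3, 26),
    "manding": date(2020, 3, 12),
}
print(overdue_reviews(log, today=date(2020, 3, 27)))  # → ['manding']
```

A check like this only surfaces programs that slipped past the protocol; the actual analysis of trends and outcomes still belongs to the supervisor.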

2) There is a system for direct staff to alert supervisors that a change is needed – and supervisors make that change immediately (or grant direct staff the authority to do so).

Registered Behavior Technicians (RBTs) need to evaluate microdata daily, within every session, and determine whether to alert their supervisor that the program might need an intervention, an additional prompt, a faded prompt, priming, a new target, a change in the reinforcer schedule, or a similar change to the intervention. If direct-care staff (e.g., RBTs) are unable to make some level of change themselves, and there isn’t a system to alert and bring in the consultant before the scheduled visit, then the fastest the program can change is the frequency of the consultant’s visits. Do you notice the concern here? What if the learner is making rapid gains? Then the program cannot advance at the pace of the learner. What if the learner is struggling? Then an ineffective procedure continues to be administered – or at best, the program sits on hold until a proper change is made. By utilizing a protocol where the consultant responds to alerts that a change is needed, the learner’s progress can be expedited – and it encourages more eyes on the data!

3) Make a list of unique, data-based rules that are learner-specific, derived from trends.

Some learners may have unique rules that are derived from their data. For example, the scores of one of my favorite little learners would plummet if you were loud and energetic – which is surprising, as that is typically how therapists are trained to act with 3-year-olds. This learner’s frequencies would only respond to calm, nonchalant celebrations.

Another data-based rule could concern mastery criteria. In another instance, I had a learner whose southern drawl was so thick that it slowed down his pace on various see-say programs. His frequency aim to master see-say programs was about 10 counts/min lower than his peers’; yet his performance still produced application and endurance.

These are just a few examples of how nuanced circumstances, revealed by the data, lead you to derive learner-specific rules that won’t be applicable to everyone. But these rules will apply to your learner until the data suggest otherwise.
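One way to keep such rules actionable is to record them as explicit overrides to team-wide defaults, so every clinician on the case sees the same rules. The sketch below is purely illustrative – the rule names, default values, and learner labels are invented to mirror the two examples above:

```python
# Team-wide defaults (hypothetical values for illustration).
DEFAULTS = {"celebration_style": "energetic", "see_say_aim_per_min": 60}

# Learner-specific overrides derived from each learner's data trends.
LEARNER_RULES = {
    "learner_a": {"celebration_style": "calm"},
    "learner_b": {"see_say_aim_per_min": 50},  # ~10 counts/min below peers
}

def rules_for(learner: str) -> dict:
    """Merge team-wide defaults with any learner-specific overrides."""
    return {**DEFAULTS, **LEARNER_RULES.get(learner, {})}

print(rules_for("learner_b"))
# → {'celebration_style': 'energetic', 'see_say_aim_per_min': 50}
```

The point is not the code but the practice: the override only exists because the data justified it, and it disappears when the data suggest otherwise.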

4) You have a system of measuring how many decisions are made regarding your client’s data.

“What gets measured gets managed” (Drucker, 1954). If we want people to make frequent decisions about data, we need to take data on decisions. These could be decisions at any clinical level – RBT, BCaBA, BCBA, parent – so long as they are decisions about the data.

There are three types of decisions one can make: 1) stay, and continue the course because the data look good; 2) stop, because the programming is either not appropriate for the learner or is now mastered; 3) change, to make the program harder or easier for the learner (or to make a stimulus, antecedent, or consequence change) (Kubina, 2019). These decisions can be made during team meetings with agreement (you can track agreements/disagreements), or during periods when the supervisor is providing demonstration and direction. They can also be independent decisions. CR PrecisionX actually has a nifty feature that tracks whether decisions were made and by whom!
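If you log each decision as it is made, tallying the three decision types – and who made them – takes only a few lines. This is a hypothetical sketch, not the CR PrecisionX feature; the log format and roles are assumptions for illustration:

```python
from collections import Counter

# Hypothetical decision log: (clinician role, decision type) pairs
# recorded over one week. Decision types follow Kubina (2019).
decisions = [
    ("RBT", "stay"), ("BCBA", "change"), ("RBT", "stay"),
    ("BCBA", "stop"), ("RBT", "change"), ("parent", "stay"),
]

by_type = Counter(d for _, d in decisions)   # stay / stop / change tallies
by_role = Counter(role for role, _ in decisions)  # who is deciding

print(by_type)   # e.g., Counter({'stay': 3, 'change': 2, 'stop': 1})
print(len(decisions), "decisions this week")
```

Even a simple tally like this gives you the measure rule #4 asks for: how many decisions were made, of what type, and by whom.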

A graph depicting the different types of decisions made about the learner’s microdata can be quite telling as well. You can evaluate if the recent uptick in challenging behavior was due to too many “change to make harder” decisions or “stop and add” decisions. Or, conversely, you can monitor how challenging behavior maintained at low levels even with these changes occurring, which is a significant indirect outcome of the intervention.

In a recent study, Dr. Rick Kubina worked with therapists to improve their decisions made per week. The control group made 52 decisions per week about their clients’ data; the experimental group made, on average, 437 decisions per week. This increase in attention to analysis and decision making resulted in significant increases in post-test scores on the Vineland Adaptive Behavior Scales, Third Edition (Kubina & Sullens, in press).

These four considerations, while simple, tend to get lost in the shuffle. When we are not attentive to the waxing and waning of the microdata – making frequent changes to help promote success for our learners – we end up with graphs that linger too long or go untouched while learners stagnate. We need to do more than just intervene – we need to intervene with purpose, and that purpose comes from our active and frequent analysis!

About the Author

Dr. Kerri Milyko, BCBA-D, LBA
CentralReach Director of Clinical Programming

As a graduate of the University of Florida and the University of Nevada, Reno, Dr. Kerri Milyko and her husband, Ethan, moved to Tampa, FL, to open Precision Teaching Learning Center. Precision TLC serves children of all ages, with and without diagnoses, and consults with local private schools for their exceptional students.

Through Precision TLC, Kerri is the Director of Development and Outreach, working to disseminate “Practical PT” for parents, teachers, and practitioners by providing consultation for training, curriculum, and intervention. She has also started another business venture with friends and business partners in Reno, NV: The Learning Consultants. There, as Director of Research and Development, her team provides behavioral services through precision teaching to children with skill deficits and various diagnoses in home- and clinic-based settings. They also provide business-to-business consultation on system-wide assessment and implementation to transform agencies that want to adopt precision teaching.

In Dr. Milyko’s effort to seek all possible reinforcers, she also teaches graduate ABA classes at the University of West Florida and assists the director in designing courses to incorporate more PT-based, instructional design content in application and concept learning. She continues to be involved in research by supervising and contributing to master’s theses and collaborative projects applying PT to unique areas of education and psychology. In 2019, she was elected to serve three years on the Board of Directors for the Standard Celeration Society, and appointed by the governor of Nevada to serve on the first-ever Board of Applied Behavior Analysts to create ABA practice regulations for the state. Personally, Kerri values quality time with her three children, her husband, and dear friends. She loves wine and butter, true crime podcasts, and a good sci-fi novel while tinkering in her backyard.