Change Agents and Implementation Science with Crystle Alonzo and Rouzana Komesidou

00:11 Tiffany Hogan: Welcome to See Hear Speak Podcast Episode 12. In this episode I talk with Crystle Alonzo and Rouzana Komesidou about change agents and implementation science. We discuss research on the diffusion of ideas and concepts, how your colleagues' orientation to change affects your ability to mobilize change, and implementation science (What is it? How does it influence the implementation of evidence-based practice? And what's your role in it?)
This conversation is part of a series on leading literacy change that I have created for a course I teach online at the MGH Institute of Health Professions in Boston.
Thank you for listening! And don't forget to check out www.seehearspeakpodcast.com to sign up for email alerts for new episodes and content, read a transcript of this podcast, access articles and resources that we discussed, and find more information about our guests. Also don't forget to subscribe to the podcast on Apple Podcasts or wherever you are listening.

01:23 Tiffany Hogan: Welcome to SeeHearSpeak Podcast. Today, I have with me Rouzana Komesidou and Crystle Alonzo and we will be discussing how to mobilize change and implementation science. And I will have them each start by introducing themselves.

01:40 Rouzana Komesidou: So, hi, thank you for having me today. My name is Rouzana Komesidou and I'm a postdoc fellow here at the MGH Institute of Health Professions. I work at the SAiL Literacy Lab with Dr. Tiffany Hogan on a longitudinal study that follows children from kindergarten to second grade and examines their language and reading development. And I am also very much interested in implementation science, which brought me here today.

02:08 Crystle Alonzo: Hello, I'm Crystle Alonzo and I'm excited to be here talking about these two topics that I find very interesting and have been thinking about for many years. I am currently a postdoctoral research fellow at the University of Montana working with Dr. Julie Wolter, also as a Project Director like Rouzana, on a longitudinal study looking at children with developmental language disorder who may potentially go on to develop dyslexia.

02:37 TH: Crystle, when you spoke to my class on Leading Literacy Change for several years, the title of your talk was Finding Your Place on the Map to Effectively Mobilize Change. And as part of that lecture, you brought in the, what was it called, the Diffusion of Innovation Model. So can you start out by telling us what is that model?

02:59 CA: Sure. So it's actually a body of research on the diffusion of ideas and concepts, and it spans many fields such as public health and medicine, as well as sociology and education. And it's been around since the 1950s. And if anybody is interested, there is a book called Diffusion of Innovations written by Everett Rogers. And although it is large, it's actually very readable and has lots of real-world examples. So I would definitely encourage people who are interested in the topic to go forth and look that up as well.

03:32 CA: But the diffusion of innovation, at its most basic core, can be defined as the process by which an innovation is communicated through certain channels over time among the members of a social system. So it's a mouthful there but, more importantly, I think we need to know what an innovation is. And so an innovation is any idea, object, or practice that is perceived as new by members of the social system. And so I think this is really relevant to those of us in Communication Sciences and Disorders, both as researchers or clinicians, to think about the ideas or the clinical practices that we decide to use and learn about in our everyday practice.

04:13 CA: And then, I think the channels by which that innovation is communicated to us are really important too, because we have things like mass media communication, and that includes things like radio or television or newspapers. So for those of us in our clinical field, those are the research articles that we're reading in journals to try to learn about new innovations in our field. But then we also have those interpersonal communication channels too, right? So our peers, the face-to-face exchanges that we have. And, actually, I would also say that when we're taking coursework as students, that includes interpersonal communication as well.

04:49 CA: And then, finally, there's the interactive communication channel through which we learn about new innovations. And that really truly is the internet. And in our field, specifically, that's exploded recently in terms of social media and Facebook groups, and blogs, and podcasts such as this one. So I think that that is certainly how we are communicating and learning, as well as telling others of these innovations. And then when we think about time, there is a process, an innovation-decision process, for how we think about innovations and decide whether or not we're going to adopt them and use them. So I'll remind you again that when I'm talking about innovations, those are practices, interventions, curricula, assessments that we decide to use in our everyday practice. And so, that is certainly something that's gonna happen over time, our decision to use something or not, or to abandon it.

05:42 TH: So this model is a way for us, as a field, to think about the uptake of new information and how it is adapted to the situations that we're in, like you mentioned, like a new intervention. Or let's say a new test is published and we're trying to make that decision, do we use it or not. And for this course, thinking about leading change, it's really less about your individual decision to choose to use a specific test or intervention, and it's more about how you are a leader in the process of creating change, mobilizing change, as you said in the title, and so it's thinking about the factors involved. But I just mentioned that there are these individual desires to change, or openness, possibly, to change, and you're thinking about your own self. Like, "For my practice, I'm choosing this specific test." That's under your decision-making.

06:33 CA: Correct.

06:33 TH: Process, right? But what about thinking about making change: How does a person's individual openness, their responsiveness to innovation and change, impact a leader's ability to make change at a system-wide level?

06:50 CA: Yes, I think that's a great segue into that last piece of the diffusion of innovation definition, which was looking at it through social systems. So the fact is that we are not working in silos: Even as an SLP, perhaps in a school, you might be that one SLP in that school, but you're working within a social system. And so there really are key players that we need to think about, and I know that there has been some discussion in the leading literacy change course where we talk about change agents, and how we're really trying to empower clinicians to feel like they're the change agents in their schools, to lead the charge in bringing forth evidence-based practice related to literacy and language. And there are other key players that we need to consider as well. So there are people that we would also call opinion leaders. And those are people who are able to influence others' attitudes, or who have overt behaviors that informally kind of move the needle in a desired way. And they do it very frequently, so those are the kind of people you look to.

07:55 CA: So when you think about yourself in your social system, most likely in a professional setting, you think about who are the people who are consistently coming forth with new ideas. And that might be you. You might be thinking, that's me, the one who's constantly telling people about new things you've heard about. There are also those gatekeepers, because we work within systems where, like Dr. Hogan said, we're not always able to make those changes or decisions on our own. Usually there are others in the system who are the decision makers, because they usually control access to information or critical resources. So in schools we have to have these conversations with our administration to be able to implement something more system-wide. And then... Yeah, so those are our main key players, I think, that we wanna think about. And then you also have to think about the rest of the groups. So we have what's called adopter categories, and some of you may have heard of this because this idea of diffusion of innovation has also kind of taken over business and marketing, and how innovations such as technology like cell phones or smart... Oh, what are those things called... Like Alexas and things like that.

09:10 TH: Oh yeah. Yeah.

09:11 CA: Smart home gadgets and those things. How did these actually get to market and kind of explode? So in marketing, they think about these too. And when they think about these different categories, they think about five different groups of people, essentially, and how people fit into these categories. So the first are the innovators, and these are usually the people who are visionaries or risk-takers. They're very venturesome, and they're eager to generate as well as to try new ideas. And that's really only about 2.5% of people. And then we have early adopters, which are respected, knowledgeable leaders, so those opinion leaders that I was talking about earlier that we might find in our professional systems. And they're usually open to adopting new ideas, and that's about 13.5% of people. And usually, if you get these two categories, that's the tipping point that you might hear about in business terms, of when something might saturate the market. If you can get those guys on board, you might then get more people to adopt the innovation. So then you have the early majority, and that's about 34% of people. They're usually followers rather than leaders, but they're very deliberate in their adoption decision. They're not just gonna do it because it's the newest fad that's out there. They're gonna consider that process before they make their decision.

10:25 CA: And then you have the late majority, which is the other 34%. They're skeptical, they're cautious about adopting a new idea, but they're motivated to adopt, most likely because of peer pressure. These are the people who, for example, now have an iPhone, not because they really wanted it but because it's ubiquitous. [chuckle] And so they had to adopt and get on the smartphone train. And then lastly, you have the 16% which we call laggards. They're traditional and suspicious of new ideas, and they're usually the last to adopt an innovation. And I think what's really important about these adopter categories is that it is your normal bell curve, for those of you who are picturing it right now, but you fall into different adopter categories for different concepts. So you might be an early adopter for technology because you love technology, but you might not be an early adopter for the latest fashion trends, for example. So it can vary depending on what you prioritize, what your needs are, and what's most important to you in terms of your values and beliefs.

11:24 CA: And so when it comes to curriculum, if you've, for example, worked in a school for a long time and you've seen lots of trends in curriculum come through multiple times and fail, with a new one coming the next day, then you might be a little bit more hesitant. You're maybe more in the late majority, not really so eager to try the latest, hottest thing because you've seen things fail before. So although I want people to think about these categories, we shouldn't try to put everybody in a box, because it could really depend on the context and the experience of people.
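
To make the percentages concrete, here is a minimal, illustrative Python sketch of Rogers' adopter categories and the cumulative-share arithmetic behind the tipping point Crystle mentions. The names and structure are hypothetical, invented for this transcript, not from the episode or any diffusion-research library.

```python
# Rogers' five adopter categories and their approximate population shares,
# as described above (a hypothetical encoding for illustration only).
ADOPTER_CATEGORIES = [
    ("innovators", 2.5),       # visionaries and risk-takers
    ("early adopters", 13.5),  # respected opinion leaders
    ("early majority", 34.0),  # deliberate followers
    ("late majority", 34.0),   # skeptical, adopt under peer pressure
    ("laggards", 16.0),        # traditional, last to adopt
]

def cumulative_adoption():
    """Print the running share of the population as each category adopts."""
    total = 0.0
    for name, share in ADOPTER_CATEGORIES:
        total += share
        print(f"{name:15s} {share:5.1f}%  cumulative: {total:5.1f}%")

cumulative_adoption()
# innovators + early adopters = 16%: the "tipping point" after which
# diffusion into the majority categories becomes much more likely.
```
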

11:58 TH: It is interesting to think, though... I can imagine that you are a different category according to, as you mentioned, maybe your context of technology versus fashion versus the school system. But one thing you did point out, too, is that you as a person could change even within a specific context.

12:18 CA: Yes.

12:19 TH: You could change over time...

12:20 CA: Yes.

12:20 TH: Based on your experiences. So maybe initially you would be an innovator and a risk-taker and an early adopter, and then later, as you experience more trends, you might become more of the late majority, or more skeptical over time.

12:35 CA: That's very true.

12:35 TH: So it's context and you as a person changing. So as a leader, you think about the community you're trying to lead. So if you're a reading specialist and you want to adopt a new evidence-based practice and you feel confident it's the right decision, now your job is to get everyone else on board. So in doing so, thinking about your school, the stakeholders that you're involved with, the teachers, the administrators, the parents, everyone who is going to be a part of this change, it's helpful to think about the category they might fall into.

13:12 CA: Right. Yes. And I think that's why finding yourself on the map is really good. So who are you in the context of where you're trying to make this diffusion happen, and who are those around you, actually? And whether they're going to be on board with you or ones that you'll have to work with a little bit more. But also getting to know how they'll probably approach the innovation-decision process, because there are stages to how somebody goes ahead and decides if they're going to actually implement. And truly, implementation is not the end goal; it's actually the sustainability, the confirmation that they'll continue using it even after implementing and not discontinue. So yeah, you need to try to figure out how everyone falls within those stages. Some people will be quicker to make those decisions on whether they're going to adopt something or not, and some might need a lot of time and a lot of consulting and encouragement. And so I do think context here is key, actually, for trying to be that change agent in your school as a reading specialist or as a speech-language pathologist. If you're trying to get a seat at the table for the literacy team, which I think often... I know for me, when I was a clinician in schools, that was something I had to do: try to show why I should be a part of the team and a change agent in this specific realm.

14:36 TH: So, you talked about stages. What are the stages of the innovation-decision process that people go through on the way to confirmation?

14:43 CA: Great. Well, so there are five key stages. And the first, which I actually think is a really interesting one... I look at these concepts as a clinician, as a professor who would be training future clinicians, and also as a researcher who wants to work with clinicians, and I think that these stages can actually look different for those different groups. But the first is the knowledge, or awareness, stage. So it's becoming aware that a new innovation, product, or practice exists. I'm gonna go over all the stages and then I'm gonna delve into a couple a little bit more. The second one is the persuasion stage, and it's usually when you're developing an interest, and usually you also develop an attitude towards whatever the innovation is. You haven't made a decision yet, but you start having a gut reaction: Is it favorable, is it not so favorable? Do you want to learn more about it and trial it, perhaps, or are you immediately kind of turned off by the new innovation?

15:40 CA: And then the third stage is the decision stage. And so this is where you're actively engaged in activities that are going to lead you to choose to adopt or to reject the innovation. And this might be where we see some trials, for example, or more discussion, perhaps, with those who are trying to be the change agents and tell you more about the innovation, for you to make the decision. The fourth one is the implementation stage. And this is typically when you're going to use it on a much larger scale, so you'll not just trial a part of it, but you might actually try to implement the full assessment or intervention innovation, in this case. And then finally is the confirmation stage, which is where you're actually gonna decide to continue to adopt, or you might be a later adopter. Perhaps you decided to reject it in the implementation stage, but you saw others around you who were implementing it and they liked it, so now you're going to adopt it later on. Or you adopted it and decide you want to discontinue: It's not working for you, it no longer serves your needs. Or you're gonna continue to reject it even while you see other people around you using it.

16:48 CA: So those are the five main stages. But I think for me, one of the most powerful stages, which is so very interesting, is at the beginning, the knowledge or awareness stage. Because for those of us who are actively engaged in research programs, we're very much at the stage of mostly trying to let people know about our innovations, and that's usually through publications, right? So we're publishing our research and letting clinicians know about it. There are also CEUs or presentations at conferences or workshops to try to tell people about what intervention or assessment we've trialed in our research and now want them to go ahead and use. But that tends to stay at this beginning stage. It's really just awareness, or knowledge: We're telling them about the innovations. We really aren't digging yet into the how-to knowledge, right? That's really when we get a little bit more into, perhaps, the consulting with schools and clinicians on how to actually use this.

17:46 CA: So it's not just, "Here, I gave you the instructions really quickly. Now you've read the appendix that has the actual materials, so go do it." Instead, it's this kind of continuous interaction that's happening between clinicians and the innovator, essentially the researcher. And there's principles knowledge, too. So people actually have to understand what you're talking about in those papers. They have to know the theories, the methods behind it, to truly grasp it. And we're hoping that their Master's or graduate-level certificate training is going to be enough of a theoretical foundation, enough principles or theories, for them to understand what we're saying in these articles and go forth and actually implement. So there are a lot of pieces there, in terms of the awareness knowledge that we need to make sure our clinicians have, before they can actually move forward and use these with potential sustainability, right? Because otherwise, we're gonna lose them after they've tried to implement and it didn't work, because they couldn't do it to the level that was stated in our research articles.
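
For readers who want the five stages in a compact form, here is an illustrative sketch of the innovation-decision process as just described. The encoding is hypothetical, invented for this transcript, not an artifact of the episode or of Rogers' own materials.

```python
# The five stages of the innovation-decision process, in the order
# described above (a hypothetical encoding for illustration only).
from enum import Enum

class InnovationDecisionStage(Enum):
    KNOWLEDGE = 1       # becoming aware the innovation exists
    PERSUASION = 2      # forming an attitude, favorable or not
    DECISION = 3        # activities leading to adopt or reject
    IMPLEMENTATION = 4  # using the innovation at full scale
    CONFIRMATION = 5    # continue, adopt later, discontinue, or keep rejecting

for stage in InnovationDecisionStage:
    print(stage.value, stage.name.title())
```
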

18:50 TH: Yeah, I think that's really interesting to think about in the context of creating change. And I will say, as a scientist, in terms of generating knowledge, I was taught to use the scientific method. So you have a hypothesis, you have a theoretical framework, and then you conduct the study. And even if it's a treatment study, you conduct the treatment study to look at a causal effect, and you then show that there's this causal effect, and... It kinda stops there a lot of times. So it's like, "Okay, I've shown that this is effective," but there's not a sense of how this is going to get into practice, of filling that gap from research to practice. So, what do you think people are doing, scientists in particular, to try to fill that gap now?

19:31 RK: I think one thing that we can do to fill that gap is be involved more in implementation science. And what implementation science does is help you identify all those factors, whether they are facilitators or barriers, that affect how implementation happens in an actual real-life setting, whether that is a school setting or a medical setting.

19:57 TH: Is implementation science something that is radically different than what treatment researchers are already doing? How would treatment researchers think about the implementation science field in relation to what they're doing now?

20:12 RK: I think it's one step forward from treatment research. Treatment research usually happens under very well-controlled conditions. The difference with something that is happening in real-life conditions is that there are so many interacting factors and systems that affect the quality of your work and of whatever evidence-based practice you're trying to implement. So I think it matters where you conduct the research and what you involve in that research to examine the effectiveness of a program. That would be, as I see it, the main difference in bringing it from the lab to the actual context. And that's where the implementation science frameworks can help us.

20:57 TH: Is implementation science something that... I've heard this said so I'm just getting your thoughts on this. So, is implementation science something that has to happen after you've done treatment research?

21:08 RK: No, it can actually happen alongside treatment research. We have specific study designs that accommodate that; we call them hybrid designs. And what hybrid designs do is look simultaneously at effectiveness research and implementation in a real-life context.

21:30 TH: So if someone asked you, "What is implementation science? What's the primary goal, what types of studies really comprise implementation science?", what would you say?

21:39 RK: I think I would define implementation science as the scientific study of the context and the factors that affect the implementation process. And as for the different study designs that exist under the implementation science frameworks... Well, I would start actually with the core of implementation science: Qualitative research leads implementation science. And by qualitative research, I mean observations, interviews, focus groups, surveys. And this was actually very surprising for someone like me, with my background, because I do not have a background in qualitative research. And it definitely pushed me to get training in qualitative research methodology so I can conduct that work. However, we also have mixed-methods research that combines both qualitative and quantitative, and this is where the hybrid approaches come in. Because in hybrid designs, we have the quantitative aspect, which is assessing the effectiveness of an evidence-based program, and we have the qualitative aspect, where we're assessing the implementation process of that specific program.

22:52 TH: Are there different types of hybrid studies?

22:55 RK: Yes, there are three types of hybrid studies, and they differ based on the emphasis that is put on effectiveness, implementation, or both. So we have hybrid design one, which focuses more on the effectiveness side while also collecting data on implementation. We have hybrid design two, which puts equal emphasis on both aspects. And we have hybrid design three, which shifts the dynamics: You have more emphasis on the implementation science part while you're doing effectiveness studies.
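
As a compact illustration of the three design types Rouzana describes, here is a hypothetical Python sketch; the names and structure are invented for this transcript, not taken from any implementation science library.

```python
# The three hybrid effectiveness-implementation design types described
# above, encoded as a simple lookup (illustrative only).
HYBRID_DESIGNS = {
    1: {"primary": "effectiveness",  "secondary": "implementation"},
    2: {"primary": "both",           "secondary": None},  # equal emphasis
    3: {"primary": "implementation", "secondary": "effectiveness"},
}

def describe(design_type: int) -> str:
    """Return a one-line summary of where a hybrid design puts its emphasis."""
    d = HYBRID_DESIGNS[design_type]
    if d["secondary"] is None:
        return f"Hybrid type {design_type}: equal emphasis on effectiveness and implementation."
    return (f"Hybrid type {design_type}: primary focus on {d['primary']}, "
            f"while also collecting data on {d['secondary']}.")

for t in (1, 2, 3):
    print(describe(t))
```
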

23:33 TH: What do we know about implementation and the process of implementation based on past implementation science studies?

23:41 RK: So there's a statistic that shows up in many of the publications, which says that it takes an average of 17 years for only 14% of evidence-based practices to be adequately applied in a real-life context. So that sounds a little bit shocking. [chuckle]

24:00 TH: Wow. You can't see this, listeners, but my jaw is on the ground right now and I'm disheartened.

24:05 RK: Yeah. So it depends where you are, in which field you are doing implementation science. Implementation science has a strong foothold in the medical field and health professions, but unfortunately not so much in communication sciences and disorders. But we are working on it, and things are happening, and we're moving forward to get more people interested in implementation science.

24:29 TH: I definitely hear about those initiatives and this moving forward, especially in our field, and we'll come back to that. But first, Crystle, in a recent paper you have with Schliep and Morris, you provide an example of a hybrid three study of a school-based literacy intervention. So can you describe that study for us so we can get a deeper sense of what a hybrid study looks like in practice?

24:51 CA: Yes, I'll just quickly go over it, but for more detail, I would definitely recommend the article, because what we do is actually break down two different case studies: one, in adults in medical settings, and two, in pediatrics in a school-based setting. And it is loosely based on the Language and Reading Research Consortium work that was done by multiple sites, including Dr. Tiffany Hogan here at MGH Institute and the University of Nebraska, where we went into schools and did a language-based literacy intervention. And it actually was considered a randomized controlled trial. I was a doctoral research fellow on that project during my PhD training, and so we went into multiple schools, we trained teachers to use this curriculum, and then we provided lots of measurements so that we could then see student outcomes on how effective the intervention was. But in addition to that, and why this actually is a hybrid study, is that we also asked teachers about it: We had lots of surveys about fidelity, about social validity, how this actually would fit in the context of what they're doing every day, as well as whether they felt it was easy to implement or not.

26:16 CA: And so in reality, that study itself was more of a hybrid one, I would say. But the case study that we do for this paper is a hybrid three, in that it focuses a little bit more on the implementation aspect. So in this case, we would already know that the language-based literacy curriculum is, for the most part, effective from smaller trials that we've already done, but now we're trying to see whether, in the real-world context, it's actually able to be implemented or not. And we used the RE-AIM framework because, like Rouzana mentioned, there are several frameworks for implementation science. But we liked this one because RE-AIM is essentially an acronym for reach, effectiveness, adoption, implementation, and maintenance. And I just want to also mention, because I love the synchronicity of these things, I feel like this framework very nicely follows the diffusion of innovation principles that I was talking to you about, in terms of adoption and the innovation-decision process.

27:18 CA: And so the first question that we want to answer for this trial of a school-based literacy intervention is the reach. So, how do I reach the target population? Is that target population actually being reached? And so you would measure how many school districts within the state agree or decline to participate in the study, and then you would follow up, and that's what makes this more the hybrid three, more focused on implementation: You'd be following up with interviews with the school districts who said they did not want to participate, as to why. And you'd follow up with the schools and districts that decided that they did want to participate as to, again, why they chose to participate. So it is, like Rouzana mentioned, involving that qualitative-methods aspect: not just accepting that people are or aren't participating, but also trying to figure out the reasoning for why they are and aren't, because that could give us insight into barriers or facilitators for using this intervention.

28:17 CA: Then the second one in RE-AIM is effectiveness. So, how do I know that my intervention is effective? And this is, again, pretty standard with what we know as researchers about how effectiveness research works: measuring, in the traditional sense, whether the intervention worked or not. But then also asking, in addition to that, about the effectiveness of the implementation of the intervention. So this is when you're asking: Who gave the intervention? Was it always given by a teacher? Or were para-educators used to give this intervention? This is when we're looking at the fidelity, or the potential for reinvention, too, because perhaps they were using the intervention in a different way than was prescribed by the researchers.

29:02 CA: And the third one is adoption: How do I develop organizational support to deliver my intervention? So here you're assessing the adoption trends of the intervention by looking at the uptake in the various schools. Was there attrition? What were the attrition rates? Did people say they wanted to adopt it and then discontinue using it? You would also want to have observations in the classroom, again, to assess whether the intervention was being used and to what level of fidelity. And then the fourth part of RE-AIM is implementation. So here it's: How do I ensure that this intervention is being delivered properly? You're assessing the implementation in an actual classroom under real-world conditions, so you're collecting data at various time points and conducting interviews and focus groups, again, to determine the barriers and facilitators. So: Is workload or time increasing by having to implement this new intervention? Are they getting the support they need from administration to be able to do this new intervention?

30:01 CA: And then finally, maintenance. And I think this is the key piece that maybe a lot of people kind of forget about, this last part. [chuckle] So, how do I incorporate the intervention so it's delivered over the long term? I think the goal for all of us sitting around this table right now is that we don't just want people to adopt these innovations and interventions and assessments, or just general clinical practices; we want them to actually be sustainable. We want it to be over the long term. And so this is a key ingredient for trying to figure out, "Well, what will make that maintenance occur?" So you want to assess the maintenance of the intervention effects and the implementation in the classroom, perhaps six or 12 months after you've actually done your study, and how it is now being used. Because, again, this is where there could be a reinvention. Now that they're not being as closely consulted or interacting with researchers, they could be changing the intervention from what it was originally designed to be, which could be good or bad, and is still worth discussing and noting when trying to figure out if this is a sustainable innovation for somebody to be using.
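
To summarize the RE-AIM walkthrough above in one place, here is a minimal, illustrative sketch pairing each dimension with the guiding question Crystle describes. The encoding is hypothetical, written for this transcript, not an official RE-AIM tool.

```python
# The five RE-AIM dimensions and the guiding question attached to each,
# as walked through above (a hypothetical encoding for illustration only).
RE_AIM = {
    "Reach":          "Is the target population being reached, and why do sites opt in or out?",
    "Effectiveness":  "Did the intervention work, and who actually delivered it?",
    "Adoption":       "Is there organizational uptake, or attrition after initial adoption?",
    "Implementation": "Is it delivered with fidelity under real-world constraints?",
    "Maintenance":    "Is it still in use, perhaps reinvented, 6-12 months after the study?",
}

for dimension, question in RE_AIM.items():
    print(f"{dimension:15s} {question}")
```
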

31:08 RK: And I think the point on maintenance, or sustainability, is a great one and often missed, because although it appears in different frameworks, it always shows up at the end. It's something that we need to think about from the beginning, and work on from the beginning, during each stage as we move forward with our implementation efforts.

31:28 TH: Yeah, I wanna tell you about my experience on this project, because I was trained, I would say, more classically in treatment research, in the little training I had in it. But when I did learn about it and have some training, it was more traditional in the sense of looking at a cause-and-effect relationship. Working on the LARRC project was so informative for me, and it showed me how you have to have this idea of sustainability, and what we call scale-up, or getting it out into a much broader context. You have to have that mindset at the very beginning of your study.

32:07 TH: And oftentimes... When I think about the way I was trained, I had this sense that you work on this cause-effect relationship, and you've isolated it, and you say, "Okay, this is a malleable factor. I'm going to be able to change it in a treatment setting, and I wanna start with the most sterile treatment setting, and I wanna show that effect, and then I'm gonna go further and further from the sterile, I guess." But in doing that, the mindset, the purpose, is really about that cause-effect relationship. When I worked on LARRC, from the very beginning, even from the very beginning of developing the intervention, we wanted to have an intervention that would be used in the schools and that would be sustained, and so the way we structured our grant and our project was around that.

32:54 TH: So when we, for instance, initially started looking in the literature for malleable factors to create this intervention, we used teacher focus groups right from day one to ask, "What's happening right now in the classroom?" It's not always just gonna be in the literature. Seriously, I thought I was gonna read all these studies and that all of these studies were gonna tell me what to do, what works to increase, in this example, language comprehension. But actually, people are doing this every single day in the classroom, and they're having some effects, and that's great, and we hope they are.

33:25 TH: So we would go into the classroom and we would look at what's called business as usual. So that's what's happening in the classroom now, and we would talk to teachers, we'd see what's happening, we'd ask, "What's effective?" So, it's a marriage of what we saw in the literature with what we saw in the classroom, from minute one, that helped us start to create the treatment plan, the scope and sequence of the treatment. Basic decisions, when you're doing treatment research the traditional way, are really driven by convenience and what is already set up. So if you're doing a treatment study and, let's say, you're working in the clinic and the kids can only come two days a week, then I'm gonna make the treatment two days a week, 45 minutes. Well, that's just not really what's happening in the schools, and that's a big problem, I think, that's happening in the research.

34:11 TH: And so that creates this even further gap, because then it's like, "I published a research study; it showed something very effective." But I saw the kid one-on-one, 45 minutes, two times a week for a year, and then the clinician is left to say, "Okay, well, how am I gonna implement this, and what are the key effective aspects? What are the critical factors associated with the implementation?" So what we tried to do, very early on, again, was involve all the stakeholders, the administrators, the teachers, look at the context of the school, and let them tell us and help make decisions. And here's an example of a decision: We decided we'd make the curriculum be a year. That was something we wanted and, frankly, had to do with budget. But we had a lot of decision points within that year. So, how many times were we administering this per week?

34:52 TH: One decision we made: We initially wanted to do it every day of the year. It's a curriculum. We talked to the schools, and what we realized very early on is that that was unrealistic. So what we did is we said we'd have four lessons a week, which leaves one day where, inevitably, there's a field trip, there's a fire alarm, there's other priorities, so you have room to fit everything in. The other adaptation we made is that we had four units to span the year, and we realized very early on that the last unit had to be shorter. So the last unit, instead of being seven weeks, was only four weeks, because there's end-of-year testing, so we couldn't fit it into the whole year. These are decisions we made right up front to try to give this intervention a better shot at being adopted and sustained within the schools. We also thought a lot about the materials we used and how obtainable those materials were. We tried to think about schools that had lots of resources and schools that had few resources. We tried to think, for instance, about the teacher training model to train up on the intervention: We created an online module because we realized that there could be schools in the future that aren't able to come to in-person training. And that takes a lot of person power to do.

36:05 TH: And I think that's another aspect that we haven't talked about as much, but it's critical: the funding issue. So when you're thinking about scale-up, we could create an amazing intervention that takes all these resources, that has one-on-one training to get the intervention to fidelity. But that's not gonna work in the future, because the grant funding runs out and the scientists, to maintain their research productivity, to keep their jobs, continue on to do other grants. And that project is often left behind unless they have more funding. And that takes time. And so, I think for me, to see how this played out in a real-life context was so informative. And I laugh because I didn't realize I was doing implementation science until I learned more about what it was. And I thought, "Oh, I am doing that. I did that."

36:52 TH: The other thing we did was, we had an idea from the literature and from our own philosophical views of education that the lessons we created would not be scripted. We wanted to empower teachers, train them to have the flexibility to implement these lessons, and we got so much pushback from the teachers; they wanted scripting. And it really felt dirty to us. Like, "Why are we going to tell you exactly what to say?" What we learned is that that was part of the scaffolding. We should have known. Teachers needed that scaffolding. They wanted scripting, and then they could go off script a bit. But that scripting was needed, especially initially.

37:28 TH: And then the other thing we did is we looked at fidelity and which of the key components were given. And we found something quite interesting early on. We had five components of the lesson: We had an opening hook, kinda hooking kids in; we had a close to wrap it up; and in the middle, we had this gradual release of responsibility, the "I do, we do, you do" approach commonly used in schools. Inevitably, it was a 30-minute lesson, but sometimes it would take longer. And what we learned is that if the teachers would just skip ahead and go to the close, the outcomes for children were better than if they just stopped abruptly. And these are the things that we learned by allowing some flexibility and talking to teachers and having those focus groups. And so I have more of a sense now that scientists have to change the way they think about treatment research right up front. I thought implementation science was what you do at the end, when I first thought about it, before I had this experience.

38:30 CA: And I think that that is actually correct. When you look at the traditional pipeline for research, it does go clinical effectiveness trials and then implementation science research. And it's this idea of the hybrid that is actually trying to bring them together, to happen at the exact same time, and also to think about it beforehand, right? So not just waiting until the end. And I think that that is the key ingredient here, because, I think personally, it's just a very hard pill to swallow that we think if we create these wonderful interventions and then go tell clinicians about them, they're just going to be able to implement them to fidelity forever and always.

39:17 CA: The contexts of being in a school are, inherently, going to be different. It's such a heterogeneous population. There's no way we can expect that the intervention, even if we did it in school A, is going to be the same at school B down the street. It'll be a completely different context with different barriers and facilitators. And so there has to be this flexibility, this agility to these interventions, or this potential for reinvention of these interventions and assessments, because there's no way it can just be a carbon copy from school to school and still work, because everyone will have different contexts.

39:57 RK: Yeah, and I completely agree with you, Crystle. I think we've seen that, especially in the past year, with our work here with some of our local schools. We seek partners for our research programs, and after being involved in several discussions with them, we did realize that we cannot just go in, take a specific program or practice, and tell them, "This is what it is. This is how you apply it." We have to sit down with them and understand their situations, because we've been working with some of the poorest schools in the district, with a high ELL population. So they have different things that they go through. They have different barriers and different facilitators. So that was, I think, a very good, eye-opening experience for us, because then we had to step back and say, "Okay, how can we help you take this evidence-based program and adapt it to your own context," which goes back to the reinvention piece that you mentioned, "and benefit students and the school staff?"

41:13 RK: So it is definitely something that we always need to keep in mind: No matter how much evidence and how much research you have about a specific program or practice, you don't actually really know how it works until you go to that context and apply it. One school could be different from the next, and from the next. One clinical practice setting could be different from another. So you need to be open and ready to reinvent, and be patient, because it takes time to understand a context and apply a specific program to that specific context. So specificity is a key component here, and we need to remember that.

41:56 TH: I've been thinking a lot about myself and my own self-assessment of where I fit among the adopter categories. And I think as scientists, we have to think about what our training is, and we tend to wanna gravitate to what makes us feel very comfortable.

42:16 CA: Yes.

42:16 TH: And we're comfortable in the scientific method; we've trained on it. We're comfortable writing research articles; we're comfortable writing grants that test certain hypotheses. And that's taken so many years, and we really hone a practice in that way. And then we also become enculturated to what gets us moving forward in our careers. And so, we have a different currency than what is happening in the schools. So our currency as scientists is: We have to publish articles, we have to find our doctoral students, we have to present at conferences, we have to get grants. This is what we do to maintain our careers, and we balance that with service and teaching. So that's what we do on a daily basis. But contrast that with schools, if we're working on school-based intervention: What they focus on on a daily basis has a lot to do with school culture and understanding the state and federal mandates. At the heart of it, of course, is improving student outcomes and developing children who are meeting their highest potential.

43:20 TH: And so they have different pressures compared to the scientists. And so it's easy to just separate yourself and say, "Well, I'm gonna do the science and that's what I'm gonna do." And I remember when I was a doc student, I asked someone in the field who had a really scientifically effective program... I said, "I was just so surprised that people are doing this in the school. Have you thought about doing school-based studies?" She was only doing clinic studies, and she said, "You know, that's not my job. My job is to do this, and then there will be other people that do the translation." And I think that was the model that was taught. But one thing that I've tried to do to have a bigger impact is, of course, learn more about implementation science, think about it from the beginning, but also just become comfortable being uncomfortable, because it's such an uncomfortable experience to have to sit and really listen to all the stakeholders. It's uncomfortable because learning is painful and you have to do new things. And it's also uncomfortable because you wonder if it's going to be a good pay-off in the end.

44:18 TH: And how are you going to get this funded? How's it going to work? And frankly, doing this podcast was really uncomfortable, but it met the purpose I had, which was to try to help bridge that gap in some possible way that meets the desire of the people I'm working with. My students are the ones that convinced me to do this, and I'm glad I did, but it was something I hadn't thought of doing. So it's that kind of interaction, that two-way communication, that made the biggest difference. But I think that the newer generation of scientists, yourselves, Crystle and Rouzana, are able to take this approach and incorporate it from minute one and still hopefully get that currency that you need, right? So you still get the publication, you still get dissemination, you're still getting funding, but you're able to do this in a way that hopefully will work right from minute one, and not take 17 years to get...

45:10 RK: Yes, for 14%.

45:12 TH: For 14%. Oh, it's even worse. To get it in the classroom. So I think that's really important. And speaking of new investigators: What is ASHA doing right now, Crystle? I know you're involved in the committee. Can you tell us what ASHA is doing to promote implementation science, to help reduce those 17 years for 14% of research?

45:32 CA: Yeah. So I am currently a member of an ASHA committee that is affectionately called CRISP, and it stands for Clinical Practice Research, Implementation Science, and Evidence-based Practice. And we have a collection of different scientists in the field, as well as some who work clinically, who come together and discuss the state of our field in terms of evidence-based practice. And so those statistics of the 17-year pipeline and only 14% actually being implemented and changing clinical practice, as well as how much of what is being published in ASHA journals is actually clinical practice research and not basic science or translational science, are all questions that we tackle on a regular basis in our meetings, trying to figure out how best we can support clinicians who are eager to implement evidence-based practices but perhaps are struggling to figure out what is evidence-based and how to access it. Paywalls are another thing that we're always thinking about.

46:47 CA: And then also, how can we support scientists who want to do this clinical practice research and implementation science? Clinical practice research has been in our field of CSD for a long time, but it hasn't always been as supported as we'd like in early careers. People have kind of always said, "Oh, yes, do treatment research, do intervention work, do clinical practice research, but after tenure," because of that currency that Dr. Hogan was just talking about, in terms of, "You gotta get the funding and the publications out first," because intervention work can take such a long time. You just heard: 17 years. So in the traditional sense, that was the framework people thought about intervention work in, and the implementation science side is now kind of shifting that perspective: It doesn't have to take 17 years; it can be shortened; it can be done early-career. So CRISP really does have a mandate to try to get more PhD students and early-career scientists into clinical practice and implementation science work, so that we can start to close this research-to-practice gap, and there are several initiatives that they've done.

48:01 CA: A lot has been, primarily, if we think about the diffusion of innovation, at the awareness-knowledge stage: We've been trying to get the word out. There have been lots of presentations at, for example, the ASHA convention every year. We've had some special issues that have been published and supported, if not written, by many of the members of the ASHA CRISP committee. And then, we've also now started to support some funding mechanisms for this kind of work. So the ASHA Foundation recently supported a researcher-clinician collaboration grant; I think they funded three or four of these last year, which is a new mechanism. And then there's the DISTAnce award, which is actually trying to pair current CSD researchers who are interested in implementation science with implementation science researchers from other fields, some very big names who have kind of pioneered this work, to go to a conference and essentially write proposals for implementation science work. Because there are funding mechanisms for implementation science work, just not so much in our CSD field; they tend to be in other medical and healthcare-related mechanisms.

49:13 CA: And so, know that we talk every month [chuckle] about this topic and how we can help support the association and its members, both clinicians and scientists, because I think we're all in service of trying to move clinical practice forward so that we can improve those client and patient and student outcomes. And so, it is, like Dr. Hogan said, about trying to, again, find your place on the map for mobilizing change, which was the title of the lecture I gave for many years: figuring out where you fall in this ecology of moving science forward to improve outcomes for our patients and clinicians. And so, trying to decide how tolerant you are of risk-taking and discomfort, what's your appetite for conflict and change, how deep are your ideological roots, and how open are you to questioning them. And that's okay. We need all the categories of adopters, and we need all the different key players, and I'm just excited to be at the table talking about it, really.

50:24 TH: And I think what's great about this too, and we haven't emphasized it much yet, is that there is this relationship between the researcher and the clinician. So this is not where the clinician is in a passive role, waiting for the researcher.

50:39 CA: Yes.

50:40 TH: And the clinician, I think, many times can feel like they don't have a voice, and this is the opposite. So the voice is loud and clear, and it's a two-way process. So let's say people are now, hopefully, interested after hearing this podcast in learning more about implementation science. Rouzana, what's out there in terms of initiatives that exist outside of ASHA, and what are some more resources that people could go to to learn more about implementation science?

51:10 RK: Yeah, there are several resources and initiatives outside of ASHA, and those, I think, will be listed in the resources for this episode. So we do have a few of these in the United States. For example, the National Implementation Research Network at the University of North Carolina at Chapel Hill. We also have the Active Implementation Research Network. We have the Colorado Research and Implementation Science Program. There are a lot of resources under the National Cancer Institute, and we also have international resources such as the NIH Fogarty International Center. There's a global implementation science initiative that holds a yearly conference where international professionals who work in implementation science come together, share opinions, and discuss new directions. There's, I think, also an international journal on implementation science. So we do have plenty of resources, which is a good thing, because we are definitely behind in the field of communication sciences and disorders, but we have a lot of resources outside that we could rely on, bring into our field, and adapt for our purposes.

52:25 TH: And also, I mentioned the LARRC study; it was funded by the Institute of Education Sciences, and their funding mechanism really uses implementation science. Researchers who are funded by the Institute of Education Sciences have to meet proposal guidelines that include the stakeholders, and include them in the process of developing these interventions with the idea of sustainability. And NIH, the National Institutes of Health, has strong initiatives for implementation science too. So as we wrap up, I wanted to highlight a few final points. One is that there's a fantastic article in the Journal of Speech, Language, and Hearing Research, JSLHR, by Natalie Douglas and her colleagues, offering five ways that researchers can use knowledge from the field of implementation science to enhance their programs of research. And I would say these ideas also apply to clinicians who want to enhance their evidence-based practices. So number one is: Apply implementation science theories, frameworks, and models to your research, and I would argue that you should apply those models very early in your research program as you're working towards intervention studies.

53:38 TH: The next one is to use these models to speed up the implementation process, incorporating hybrid research designs right up front, so that you consider the stakeholders and the environment, and consider the scalability, as I mentioned in the example from LARRC, right up front in your process. And that leads to the third idea, which is to make sure you engage all stakeholders in that research process right up front. So science isn't done in a silo; it's a two-way street between researchers and clinicians. The next one is to connect with implementation science agencies and networks, following up on some of the initiatives and training opportunities and resources that Rouzana mentioned. And then, evaluate treatment fidelity in your treatment research. So as a researcher, make sure that you attend to these components: not only how well clinicians are implementing the intervention that you created, but why they are not implementing it. If there's low fidelity, why? And can the treatment then be adapted to increase fidelity? And then also, as a clinician, if you were trying to implement something evidence-based and you're not able to do it, why? What are the barriers, and how can that change the way you approach the gap from research to practice?

54:58 RK: I definitely recommend this article by Natalie Douglas and her colleagues, and I do like how she ends the article and emphasizes why implementation science could be beneficial for our field. She talks about identifying research priorities, reducing health care disparities, and increasing accountability and quality control. She talks about improving clinician competence and satisfaction, and also, of course, improving patient and student outcomes. But what I really like about this article is how she answers her own question. The article's title is "Implementation Science: Buzzword or Game Changer?" And I definitely agree with Dr. Douglas: It is a game changer, not a buzzword. But I would say it is a game changer only if it's done correctly, because this is a very complex matter. You move from your well-controlled lab setting into a very complex real-life context where multiple systems are interacting, and you're basically disturbing that setting. So we need to be very careful how we approach this, but it definitely has great potential to narrow that research-to-practice gap.

56:18 TH: Thank you so much for your time, Rouzana and Crystle. Thank you so much.

56:22 CA: Thank you.

56:22 RK: Thank you.

56:26 Tiffany Hogan: Check out www.seehearspeakpodcast.com for helpful resources associated with this podcast including, for example, the podcast transcript, research articles, and speaker bios. You can also sign up for email alerts on the website or subscribe to the podcast on Apple Podcasts or any other listening platform, so you will be the first to hear about new episodes.

Thank you for listening and good luck to you, making the world a better place by helping one child at a time.
