Many teaching innovations have been developed over the past 20 years, including a number of Research-Based Instructional Strategies (RBIS). However, limited research addresses how many faculty members use these strategies or, when they do implement them, whether they follow the theory and steps intended by the developers. The degree to which an implemented intervention follows the original design is called Fidelity of Implementation. This paper introduces Fidelity of Implementation to the engineering education community. Using a national survey of 166 statics instructors, we investigated the level of self-reported fidelity of nine RBIS. In this paper, we report overall fidelity values and present in-depth results for three specific RBIS: Concept Tests, Collaborative Learning, and Problem-Based Learning. Specifically, we discuss how reported use of these RBIS compares to reported use of the classroom activities identified as their critical components. We used significance tests to determine whether critical components discriminated between users and nonusers. To quantify the fidelity of the different RBIS, we examined the percentage of required critical components implemented in conjunction with each RBIS. Use of all critical components for each RBIS varied from 55-83%. Higher percentages (65-83%) were associated with RBIS that had one required critical component, such as Concept Tests. For RBIS with more critical components (3-5), such as Problem-Based Learning and Collaborative Learning, the percentage of users with complete fidelity (all critical components) was low (3-66%), but the percentage implementing no components was also low (for most of these RBIS, 0% of users reported using none or only one critical component in the classroom). To highlight the relationships between users and critical components, a chi-square test was performed comparing each RBIS to the different activities. Many of the activities (critical components) showed a statistically significant relationship with reported use of the corresponding RBIS, thus discriminating between users and nonusers.
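The chi-square test of independence described above can be sketched as follows. This is a minimal illustration with a hypothetical 2x2 contingency table (the counts are invented for the example, not taken from the survey): rows split respondents into RBIS users and nonusers, and columns record whether they reported using one critical component.

```python
# Hypothetical contingency table; counts are illustrative only,
# not data from the statics-instructor survey.
observed = [[40, 10],   # RBIS users:    component used / not used
            [15, 35]]   # RBIS nonusers: component used / not used

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Pearson chi-square statistic: sum over cells of (O - E)^2 / E,
# where E is the expected count under independence.
chi_sq = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi_sq += (obs - expected) ** 2 / expected

# For a 2x2 table, df = (2-1)*(2-1) = 1; the 0.05 critical value
# is 3.841. A larger statistic indicates the component use differs
# significantly between users and nonusers, i.e. it discriminates.
print(f"chi-square = {chi_sq:.2f}, significant = {chi_sq > 3.841}")
```

In practice a library routine such as `scipy.stats.chi2_contingency` would typically be used rather than the manual computation shown here; the manual version just makes the expected-count arithmetic explicit.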
|Original language||English (US)|
|State||Published - Sep 24 2013|
|Event||120th ASEE Annual Conference and Exposition - Atlanta, GA, United States|
Duration: Jun 23 2013 → Jun 26 2013