Impact of physics education research on the teaching of introductory quantitative physics in the United States

During Fall 2008 we designed and administered a web survey to collect information about the pedagogical knowledge and practices of physics faculty. The survey was completed by a representative sample of 722 physics faculty across the United States (a 50.3% response rate). This paper presents results from the part of the survey in which faculty were asked to rate their level of knowledge and use of 24 Research-Based Instructional Strategies (RBIS) applicable to an introductory quantitative physics course. Almost all faculty (87.1%) indicated familiarity with one or more RBIS, and approximately half of faculty (48.1%) said that they currently use at least one RBIS. Results also indicate that faculty rarely use a RBIS as recommended by its developer; instead, they commonly make significant modifications.


I. INTRODUCTION
There have been many calls for the reform of introductory physics courses. In particular, there is concern that many college physics courses: (1) do not help students develop meaningful understanding of basic physics concepts; 1,2 (2) do not help students develop the skills necessary to solve real problems in the context of physics; 3-5 (3) turn away many capable students who find these courses dull and unwelcoming; 6,7 and (4) misrepresent the nature of physics and learning physics. 8,9 In response to these concerns (and others), the last 30 years have seen the development and dissemination of many Research-Based Instructional Strategies (RBIS). Most of these RBIS have documented improvements in one or more of these areas of concern.
Although substantial time and money have gone into developing these RBIS, little effort has gone into understanding whether typical physics instructors use or even know about these products. In this paper we describe and present the results of a web survey of a national sample of physics faculty designed to document the degree to which Physics Education Research (PER) has impacted the teaching of introductory physics in the United States. This study focused on college-level quantitative physics. By quantitative physics we mean the algebra- or calculus-based introductory physics classes that often go by the names "college physics" or "university physics." These have been the target courses for most RBIS and also represent the largest physics enrollments.

II. THEORETICAL PERSPECTIVES AND RESEARCH QUESTIONS
There are many strategies that one might use to promote instructional reform. In the context of college-level physics, however, reform efforts have been dominated by the development and dissemination of specific RBIS. Many of these efforts have been funded by the National Science Foundation's Course, Curriculum, and Laboratory Improvement (CCLI) program and the United States Department of Education's Fund for the Improvement of Postsecondary Education (FIPSE) program, which encourage such an approach. In this paper we examine the extent of instructional reform in the United States by focusing primarily on the knowledge and use of a set of specific RBIS among a national sample of physics faculty.
Specific research questions addressed in this paper are:
(1) How widespread is faculty knowledge about RBIS?
(2) How widespread is faculty use of RBIS?
(3) To what extent are RBIS modified during implementation?
(4) How frequently is RBIS use discontinued? Why?

III. DATA COLLECTION
In order to answer the research questions we developed and administered a web-based survey to a national sample of physics faculty from three different types of institutions. This section describes the survey design and administration.

A. Survey design
The web-based survey was developed by the authors in consultation with researchers at the American Institute of Physics Statistical Research Center. We briefly describe the survey below. The complete survey can be found in the Auxiliary Appendix.
The web survey consisted of 61 questions broken into five sections: (1) Introductory Screening Questions, (2) Your Teaching Situation, (3) Experience with and Attitudes Toward Teaching Innovations, (4) Your Instructional Goals and Practices, and (5) About You. Each section is briefly described in the following paragraphs.

Introductory screening questions
The first two questions on the survey verified that the participant met the selection criteria. Faculty were eligible for the survey if they had taught an introductory quantitative course in the last two years and were full-time or permanent employees (i.e., faculty who were part-time, temporary employees were not eligible for the survey). If these conditions were met, the participant was presented with an informed consent document. If consent was granted, then the survey continued.

Your teaching situation
The six questions in this section asked faculty to identify the algebra- or calculus-based class that they had taught most recently. They were then asked about several class characteristics, such as class size, number of concurrent sections of the same class, and aspects of the class that the instructor had control over (e.g., laboratory, recitation, testing, etc.).

Experience with and attitudes toward teaching innovations
This section comprised the majority of the web survey. Participants were asked to rate their level of knowledge about or use of 24 RBIS that are relevant to an introductory quantitative physics course. See the Auxiliary Appendix for a listing of these RBIS and references with more information about each. These RBIS were broken thematically into four groups. After a participant rated their level of knowledge or use for the RBIS in a particular group, up to two RBIS from that group were chosen for follow-up questions. Follow-up questions were asked for one RBIS that the participant said they currently used and one RBIS that the participant said they had used in the past. Priority in asking follow-up questions was determined by a ranking of RBIS in order of expected use, based on a paper instrument administered to selected participants at the Summer 2008 National AAPT meeting. The goal was to get some additional information about the participant's level of knowledge or use without overwhelming them with a large number of follow-up questions.

Your instructional goals and practices
The five questions in this section asked faculty to rate the importance of three common instructional goals to their introductory course. They were also asked to report on the frequency with which they engaged in a variety of instructional practices during the lecture portion of the course (e.g., traditional lecture, instructor solves or discusses qualitative or conceptual problem, whole-class voting, etc.) as well as how often they used various types of questions on tests and quizzes (e.g., well-defined quantitative problems, novel problems, conceptual questions, etc.).

About You
The 16 questions in this last section asked about respondents' primary job responsibilities, research productivity, and possible sources of information about teaching innovations, as well as various demographic characteristics, such as gender, highest degree, tenure status, and years of teaching experience.

B. Survey administration
The survey was administered in Fall 2008 by the American Institute of Physics Statistical Research Center (SRC). Sampling was done at three types of institutions: (1) two-year colleges, (2) four-year colleges that offer a physics bachelor's degree as the highest physics degree, and (3) four-year colleges that offer a graduate degree in physics. SRC staff randomly selected institutions within each of the three types. Once an institution was selected, SRC staff used the department website or contact with the department chair to identify faculty who were likely to meet the selection criteria for the survey. Only one graduate-degree-offering institution was dropped from the study, because insufficient information was available online and the department chair did not respond to emails or phone calls. Full-time, permanent faculty were the primary targets of the survey; however, full-time, temporary and part-time, permanent faculty were also included when employment and contact information was available to SRC staff. These participants were sent an introductory email invitation with a link to the survey. Two reminders were sent to nonrespondents, with at least one week between each contact. Table I shows the number of institutions and faculty in the population and sample, as well as the number of faculty who responded to the survey. The overall response rate was 50.3%. A response rate above 50% is generally considered adequate 10 and is higher than the average response rates reported in the literature for email and web-based surveys. 10,11 The number of useable responses is smaller than the actual number of responses, since some faculty declined to provide informed consent (~17.6% of respondents), had not taught an introductory quantitative course in the last two years (~9.6% of respondents), or were part-time, temporary faculty (~5.2% of respondents). In these cases the survey ended after the screening questions.

IV. RESULTS

A. Characteristics of respondents and courses
In this section we describe the characteristics of the survey respondents. Additional information about these characteristics can be found in the Auxiliary Appendix.
Type of Institution: Table II shows that the institutional affiliation of respondents is roughly evenly divided between the three types of institutions that we sampled: (1) two-year colleges, (2) four-year colleges with a physics B.A. as the highest physics degree, and (3) four-year colleges that offer a physics graduate degree.
Gender: Overall, 83% of the respondents were male and 17% were female.
Teaching Experience: Respondents had between 0 and 55 years of faculty teaching experience, with an average of 15.4 years. Additionally, when asked how many semesters they had taught an introductory-level course, 15% had taught between 1 and 4 semesters, 20% had taught between 5 and 10 semesters, and 65% reported having taught more than 10 semesters.
Characteristics of Introductory Course: For several parts of the web survey, we asked respondents to think about the last quantitative physics course that they taught (or to pick one course if they taught more than one simultaneously). Overall, approximately 2/3 (62.5%) identified a calculus-based course and approximately 1/3 (37.5%) identified an algebra-based course. The average class size for the course identified was 72 students. As expected, there was a wide range of class sizes both within and between types of institutions.
Faculty Involvement in Courses: When asked about their control over various aspects of the course they were teaching, faculty at all types of institutions reported that they had primary control over lecture (98.4%), homework (95.7%), testing (97.2%), and grading (97.6%). However, as might be expected, there were differences between types of institutions in the existence of and faculty control over laboratory and recitation or discussion sections, as reported in Tables III and IV.
Not surprisingly, faculty at two-year colleges have the most control over all components of their course, while faculty in departments with graduate programs often have control over only some parts of the courses they are involved in.It is also interesting to note that a full 16% of faculty at graduate institutions report that there is no laboratory component to the course.
Perception of Department Support: Fig. 1 shows that most respondents (91.8% overall) felt that their departments were either somewhat or very encouraging of efforts to improve instruction. Only 1.1% of faculty reported that their departments were somewhat or very discouraging. There is, however, a noticeable difference by type of institution. Faculty at institutions with a physics bachelor's degree as the highest degree are the most likely (75.2%) to rate their department as "very encouraging" toward teaching improvement. Faculty at institutions with a graduate degree in physics are less likely to rate their department as "very encouraging" and more likely to rate it as "somewhat encouraging." Still, over half (54.3%) of faculty at graduate institutions rate their departments as very encouraging toward teaching improvement.
Faculty Goals and Perceptions of Success: We asked about three possible instructional goals: problem solving skills, conceptual understanding, and attitudes toward and appreciation of physics. These are goals that RBIS commonly seek to address. 15 Almost all faculty thought that problem solving skills (90.2%) and conceptual understanding of physics content (91.7%) were very important instructional goals. Approximately half (50.8%) thought that attitudes toward and appreciation of physics was a very important instructional goal. As shown in Table V, most instructors are somewhat satisfied with the extent to which their students are meeting the instructional goals that they consider very important. However, few instructors are extremely satisfied, and many are somewhat or extremely unsatisfied.

B. How widespread is faculty knowledge about RBIS?
It is reasonable to assume that faculty must know about a RBIS before they can use it. Curriculum developers and other educational researchers frequently give talks and workshops to spread the word about their instructional strategies. Many also publish papers and books about their strategies. Some physics education reformers seek to inform faculty about the wide variety of RBIS that exist for the teaching of undergraduate physics. Examples include the Physics and Astronomy New Faculty Workshop 16 and Redish's book "Teaching Physics with the Physics Suite." 17 In the web survey we asked faculty to rate their level of knowledge and/or use of 24 RBIS (see the Auxiliary Appendix for web and print references for each) that can be used in the teaching of introductory quantitative physics at the college level. Faculty indicated their level of knowledge on the following scale: I currently use all or part of it (current user); I have used all or part of it in the past (former user); I am familiar with it, but have never used it (knowledgeable nonuser); I've heard the name, but do not know much else about it (little knowledge); I have never heard of it (no knowledge).
Percentage of instructors who know about X or more RBIS: Table VI shows the 24 RBIS ranked according to the percentage of faculty who said that they had knowledge of the particular strategy. This includes current users, former users, and knowledgeable nonusers.
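As an illustration of how this knowledge measure is computed, the sketch below (in Python, with a hypothetical respondent; the label strings are shorthand for the survey's response options, not the exact survey wording) aggregates the five-point scale into the "has knowledge" category used here:

```python
# Illustrative sketch with hypothetical data: a respondent counts as
# knowing a RBIS if they are a current user, a former user, or a
# knowledgeable nonuser (the top three levels of the five-point scale).
KNOWLEDGE_LEVELS = {"current user", "former user", "knowledgeable nonuser"}

def knows(response: str) -> bool:
    """True if the rating indicates knowledge of the RBIS."""
    return response in KNOWLEDGE_LEVELS

def num_rbis_known(ratings: dict) -> int:
    """Count how many RBIS a respondent knows about."""
    return sum(knows(r) for r in ratings.values())

# Hypothetical respondent: ratings for three of the 24 RBIS.
respondent = {
    "Peer Instruction": "current user",
    "Ranking Tasks": "little knowledge",
    "Real-Time Physics Laboratories": "former user",
}
print(num_rbis_known(respondent))  # 2
```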
Figure 2 shows the percentage of instructors with knowledge of X or more RBIS. Overall, 87.3% of faculty report that they know about 1 or more RBIS, and 50.3% report that they know about 6 or more. Faculty from all types of institutions appear to have a good deal of knowledge about RBIS; however, as can be seen from Table VI and Fig. 2, there are some differences by type of institution. In general, faculty knowledge at B.A. institutions is higher than at two-year colleges or graduate institutions.

C. How widespread is faculty use of RBIS?
Table VII shows the 24 RBIS ranked according to the percentage of faculty who said that they currently use the particular strategy. Figure 3 shows the percentage of instructors who use X or more RBIS. Overall, nearly half (48.1%) of faculty say that they use 1 or more RBIS, 34.3% say that they use 2 or more, and 22.6% report that they use 3 or more. As can be seen from Table VII and Fig. 3, there are some differences by type of institution. In general, faculty use at B.A. institutions is higher than at two-year colleges or graduate institutions.
Another way to look at the use data for specific RBIS is to examine the percentage of faculty who know about a particular RBIS and currently use it (see the Auxiliary Appendix for full results), since an instructor would not be expected to use a RBIS that they do not know about. The RBIS with the highest percentage of users among knowers were: Peer Instruction (46%), Ranking Tasks (40%), TIPERS (32%), and Interactive Lecture Demonstrations (31%). Notice that these are all strategies that can be incorporated into the traditional lecture format.
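This "users among knowers" measure is a simple ratio of current users to knowers of a given RBIS. A minimal sketch, with hypothetical counts chosen to reproduce the 46% Peer Instruction figure (the survey reports only the percentages, not the underlying counts):

```python
# Illustrative sketch: fraction of faculty who know a RBIS and
# currently use it. Counts below are hypothetical.
def users_among_knowers(current_users: int, knowers: int) -> float:
    """current users / (current + former users + knowledgeable nonusers)."""
    if knowers == 0:
        raise ValueError("no faculty know this RBIS")
    return current_users / knowers

print(round(users_among_knowers(92, 200), 2))  # 0.46
```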

D. To what extent are RBIS modified during implementation?
One tentative result of our previous work is that physics faculty, when they use a RBIS at all, do not typically take a RBIS and implement it "as is." 18 Rather, it appears much more common for faculty to take some of the basic ideas from a RBIS but not implement the complete RBIS in the way described by the developer. The way faculty use a RBIS has significant implications for dissemination efforts. If faculty want ready-to-use RBIS and materials that they can implement as is, then curriculum developers should develop refined, tested curriculum materials and provide targeted and explicit instructions for how to use them effectively. On the other hand, if faculty want to customize a RBIS before use, curriculum developers should develop materials that are highly flexible and customizable and provide faculty with ideas about a variety of ways to use the materials. As mentioned earlier, there were four clusters of RBIS that we asked about in the web survey. In order to limit respondent fatigue, we decided to limit follow-up questions about RBIS use to two RBIS from each cluster: one that is currently used (if any) and one that has been abandoned (if any). The RBIS in each cluster were ranked by our expectation of the percentage of users, based on a preliminary paper survey at the Summer 2008 AAPT meeting. If an instructor indicated use of more than one RBIS in a cluster, they were asked follow-up questions about the one that was highest on the list. Thus, we have information about use from essentially all users of the highest-ranked RBIS in each category and very little information from users of the other RBIS in each category. It turned out that our predictions of the most-used RBIS on each list were correct. These RBIS are: Peer Instruction, Cooperative Group Problem Solving, Ranking Tasks, and Real-Time Physics Laboratories.
According to Table VIII, there may be differences in the degree of modification for each of the RBIS. Peer Instruction and Cooperative Group Problem Solving both have a significant percentage of users (47.9% and 41.0%, respectively) who indicate that they have made significant modifications. This is much higher than the percentage of Ranking Task and Real-Time Physics Laboratory users who report making significant modifications (21.2% and 21.3%, respectively). There are many possible reasons for these differences, which cannot be resolved with the data available from the survey. For example, one possible reason for the higher degree of modification of Peer Instruction and Cooperative Group Problem Solving is that these are both strategies that, to implement fully, require attention to, coordination of, and probably changes to many aspects of the course. On the other hand, Ranking Tasks are flexible tools (innovative types of problems) that can be used within a variety of instructional styles. The developers of Ranking Tasks do not suggest one particular way to use these tools, but instead suggest a variety of ways that they might be incorporated into a course. In the case of Real-Time Physics Laboratories, the relatively small percentage of instructors who make significant modifications may be due to the ready-to-use format of the laboratories.
Later in the survey, instructors were asked to describe various aspects of their instruction. Of the 15 aspects of instruction asked about, 5 are particularly relevant to Peer Instruction. Based on the recommendations of the developer, 19 a user of Peer Instruction would be expected to: (1) engage in traditional lecture nearly every class or multiple times every class; (2) have students discuss ideas in small groups multiple times every class; (3) have students solve or discuss conceptual problems multiple times every class; (4) have whole-class voting multiple times every class; and (5) use conceptual questions on all tests. Only 6.2% of the self-described Peer Instruction users met all five of these criteria, and only 21.1% met four or five of them. Interestingly, the level of use as described by the developer did not appear to vary with the level of modification reported by the instructors. For example, 6.3% of respondents who said that they used Peer Instruction basically as described by the developer met all five criteria. This can be compared to 7.3% of respondents who said that they made some relatively minor modifications and 6.3% of respondents who said that they used some of the ideas but made significant modifications. A table with more details about these data can be found in the Auxiliary Appendix.
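The five-criterion check described above amounts to a tally over self-reported practice frequencies. A minimal sketch, with hypothetical responses and shorthand item names (the actual survey items and response options are paraphrased):

```python
# Illustrative sketch with hypothetical data: count how many of the five
# developer-recommended Peer Instruction practices a respondent reports.
PI_CRITERIA = {
    "traditional lecture":       {"nearly every class", "multiple times every class"},
    "small-group discussion":    {"multiple times every class"},
    "conceptual problems":       {"multiple times every class"},
    "whole-class voting":        {"multiple times every class"},
    "conceptual test questions": {"all tests"},
}

def criteria_met(practices: dict) -> int:
    """Count criteria whose reported frequency meets the recommendation."""
    return sum(
        practices.get(item) in accepted
        for item, accepted in PI_CRITERIA.items()
    )

# Hypothetical self-described Peer Instruction user.
respondent = {
    "traditional lecture": "nearly every class",
    "small-group discussion": "multiple times every class",
    "conceptual problems": "once a week",
    "whole-class voting": "multiple times every class",
    "conceptual test questions": "all tests",
}
print(criteria_met(respondent))  # 4
```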
Many of these same conclusions can be made for self-described users of Cooperative Group Problem Solving. A table with more details about these data can be found in the Auxiliary Appendix. Few users (14.5%) use four or five of the developer-described components. Although the table in the appendix appears to show a relationship between extent of modification and use of four of the five components, this apparent trend is likely a result of the small number of respondents represented in that category.
In summary, when faculty report that they are "users" of a particular strategy, it is unlikely that they use the strategy exactly as advocated by the developer. It also appears that they are often not aware that they have made modifications. While adaptations are to be expected (not every teaching situation is the same), there are indications that faculty may modify components which are essential (for example, disregarding the peer-peer discussion when using Peer Instruction). Thus, care should be taken when drawing conclusions based on the self-reported use of RBIS.

E. How frequently is RBIS use discontinued? Why?
Based on the survey results, RBIS use is frequently discontinued, and the amount of discontinuation appears to vary widely by RBIS: from 27.1% (TIPERS) to 80% (Workbook for Introductory Physics). We calculated the rate of discontinuance of a RBIS by dividing the number of former users by the number of former plus current users. Details on discontinuation by RBIS and type of institution are provided in the Auxiliary Appendix.
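The discontinuance rate defined above can be sketched directly; the counts in the example are hypothetical (chosen to reproduce the 80% upper bound reported in the text):

```python
# Discontinuance rate as defined in the text:
# former users / (former users + current users).
def discontinuance_rate(former: int, current: int) -> float:
    """Fraction of faculty who ever used a RBIS but have stopped."""
    total_ever = former + current
    if total_ever == 0:
        raise ValueError("RBIS has never been used")
    return former / total_ever

# Hypothetical counts: 40 former users, 10 current users.
print(round(discontinuance_rate(40, 10), 2))  # 0.8
```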
In this section we focus on the four RBIS that were the top priority for follow-up questions, since these are the only ones offering substantial information about discontinuance. Table IX shows these four RBIS along with the rate of discontinuation and the length of use. This shows that it is relatively uncommon for faculty to discontinue use of these strategies before using them for at least one semester. This is important, since it likely gives faculty the opportunity to overcome any implementation dip 20,21 that might occur when they first begin to use a new strategy.
Faculty gave a variety of reasons for discontinuing use of RBIS. Table X summarizes the free-response reasons that faculty gave for discontinuing use of the four RBIS that were a priority for follow-up. Details for each of the four RBIS individually are in the Auxiliary Appendix.
Approximately one-third of faculty discontinuing use indicated problems with the RBIS: they either said that the RBIS did not work (19.8%) or that it took too much class time (12.3%). Approximately one-fifth of faculty indicated that, although they are not currently using the specific RBIS as described by the developers, they are continuing to work on instructional improvements: they either said that they have significantly modified the particular RBIS (13.7%) or that they are currently trying other instructional styles (9.0%). Other faculty indicated that they have diminished use to only occasional (9.0%), that they were currently teaching a course for which the RBIS was not appropriate (8.5%), or that their department did not support use of the RBIS (4.7%). There were also a number of responses that did not fall into one of the categories above (23.1%).

V. CONCLUSIONS
This study is the first attempt to document faculty knowledge about and use of RBIS relevant to the teaching of introductory quantitative physics. Results indicate that the development and dissemination efforts by physics education reformers have made an impact on the knowledge and practice of many faculty, but that there is significant room for improvement. The main findings reported here are:
(1) Faculty knowledge and attempted use of RBIS appear to be relatively widespread. Almost all faculty (87.1%) are familiar with one or more RBIS, and approximately half of faculty indicate that they are familiar with six or more RBIS. In addition, approximately half of faculty (48.1%) say that they currently use at least one RBIS.
(2) RBIS are commonly modified during implementation. Rarely are they used as recommended by the developer. With the current survey data we are not able to estimate the likely impact (either positive or negative) of these modifications. The high level of modification, though, suggests that this is an important area for further study.
(3) Faculty often try strategies and then discontinue their use. At least 20% of the time the discontinuance may be positive: either due to significant modification of the RBIS (presumably to suit their situation better) or development or adaptation of a different instructional strategy. In many cases, though, discontinuance occurs because faculty felt that the RBIS did not work or did not fit well with their teaching situation.

VI. IMPLICATIONS
The results of this study suggest both good news and bad news for those who advocate the use of RBIS in introductory physics. The good news is that many faculty are aware of these RBIS and willing to try them. The bad news is that RBIS may be implemented inappropriately and/or discontinued. The common dissemination strategies used within PER (e.g., giving talks and workshops, publishing books) have been successful in promoting interest and motivating faculty to try RBIS. This is what Rogers 22 calls "awareness" knowledge. However, these common dissemination methods do not seem to be successful at helping faculty understand the essential features of an innovation (what Rogers calls "how-to" knowledge). The high level of discontinuance suggests that faculty do not have the necessary knowledge to customize a RBIS to their situation (what Rogers calls "principles" knowledge) or that they underestimate the situational factors that tend to work against the use of innovative instructional strategies. 23 Thus, it is important to investigate ways to better support faculty use of RBIS once they have developed awareness and interest. This is likely to include ways to provide substantial support and guidance to faculty during the implementation and customization process, as well as ways to provide flexible curricula that can be easily customized without losing their essential features.

FIG. 3. Percentage of instructors who use X or more RBIS.

TABLE I. Population, sampled survey participants, and web survey response rates for faculty in each type of institution. Population estimates are from reports by the AIP Statistical Research Center. 12-14

TABLE II. Institutional affiliation of respondents.

TABLE IV. Existence of and control over recitation or discussion section component of the course.

TABLE V. Satisfaction with the extent to which students are meeting instructional goals. This table only includes those respondents who rated the particular goal as very important.

TABLE VI. Ranking of the 24 RBIS according to level of knowledge (percentage of faculty who indicate that they are familiar with or have used the RBIS).

TABLE VII. Ranking of the 24 RBIS according to percentage of current users.

TABLE VIII. Extent of modification identified by self-reported users of all or part of each of four RBIS. The percentages listed are the percentage of users within each of the RBIS categories who answered the question.

TABLE IX. Ranking of 4 RBIS according to percentage of faculty who have discontinued use, as well as length of use (prior to discontinuation). Discontinuance of a RBIS is calculated by dividing the number of former users by the number of former plus current users.

TABLE X. Categorization of open-ended responses (N = 212) to the question "Why do you no longer use [discontinued RBIS]?" Based on reasons given for the four RBIS that had highest priority for follow-up questions.