Synthesis of Discipline-Based Education Research in Physics
I. INTRODUCTION
This paper synthesizes physics education research (PER)
at the undergraduate level, and is based on a paper that was
commissioned by the National Research Council to inform
a study on the status, contributions, and future directions
of discipline-based education research (DBER), a comprehensive examination of the research on learning and teaching in physics and astronomy, the biological sciences, chemistry, engineering, and the geosciences at the undergraduate level [1,2]. PER is a relatively young field, about 40 years old, yet it is more mature than its
sister fields in biology, chemistry, engineering, astronomy,
and geosciences education research. Although much is
known about physics teaching and learning, much remains
to be learned. This paper discusses some of what the PER
field has come to understand about learners, learning, and
instruction in six general topical areas described herein.
A. Topical areas covered and organization
Given the breadth and scope of PER to date, we organize
this synthesis around six topical areas that capture most of
the past research in physics education: conceptual understanding, problem solving, curriculum and instruction,
assessment, cognitive psychology, and attitudes and beliefs
about learning and teaching. To ensure consistency in the
presentation and to aid the DBER committee in its charge,
each of the six topical areas is organized under the
following sections: research questions; theoretical framework; methodology, data collection or sources and data
analysis; findings; strengths and limitations; areas for
future studies; references. In this paper, the final section
on continuing and future directions of physics education
research has been removed and placed in the Supplemental
Material [3]. In addition, the references have been compiled
at the end of the paper rather than individually by section.
Because of the cross-cutting nature of some articles, some
that were included in a particular section could have just as
easily been included in another section; we highlight the
specific features of articles as they pertain to each section's emphasis. Although we did not place any restrictions on the
dates of the research studies covered, the great majority of
the studies cited are within the past 20 years and were done
in the United States. The original commissioned paper
included published studies up through October of 2010,
and this revised paper includes studies published through
May of 2013. In addition to the six topical areas, a
summary and our conclusions are presented in Sec. VIII.
Just as important as stating what this paper covers is stating what has been left out. The commissioned paper
had a specific focus on empirical research on undergraduate teaching and learning in the sciences as outlined by the
criteria set by the National Academies [1,2]. Therefore, the
following areas of research have not been included in this
review: precollege physics education research (e.g.,
research on high school physics teaching and learning)
and research related to physics teacher preparation or
physics teacher curricula. We also decided to exclude "how to" articles describing research analyses
(e.g., ways of analyzing video interviews of students)
which are pertinent for the community of physics education
researchers but do not have a direct impact on undergraduate physics education. The coverage herein is extensive,
1. Contexts
Research on identifying and documenting misconceptions is done either by administering assessments designed to probe students' views in contexts where concepts need to be applied or discussed, or by conducting clinical interviews with students about a context in which persistent conceptual errors seem to be prevalent. Clinical interviews have been the methodology most used in probing students' conceptual architecture in memory; during clinical interviews, which typically last 1 hour, a researcher probes a student's conceptual understanding of a target topic through a series of interviewer-led questions, questions that often are open ended and guided by the student's responses to previous questions. Because interview studies result in
2. Participants
Misconceptions research in STEM has been conducted
with students of all ages and in various contexts such as
large undergraduate introductory courses, as well as high
school physics courses. Although not discussed in this
synthesis, misconceptions on physical science concepts
have also been studied with middle school and elementary
school children. Recently, there have been an increasing
number of studies exploring conceptual understanding
among upper-division physics students as well as graduate
students (see, e.g., Refs. [55,56]).
3. Data sources and analysis
Data sources for misconception studies come from students' performance on assessment questions or from transcripts of clinical interviews. In cases where an intervention or teaching approach is being evaluated, assessments of misconceptions are administered to students prior to and following the intervention, and differences between the post- and prescores are analyzed. Data from interview studies are analyzed or interpreted using grounded theory [57], defined as developing a theory by observing the behavior of a group (in this case, students) in concert with the researcher's insights and experiences; the theory emerges from the data, as opposed to formulating hypotheses that are then tested.
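Where pre- and postscores are compared, a common summary statistic in PER is the normalized gain. The sketch below is a minimal illustration; the function name and the example scores are our own, not drawn from any study cited here.

```python
def normalized_gain(pre, post, max_score=100.0):
    """Normalized gain g = (post - pre) / (max_score - pre).

    Expresses the pre-to-post improvement as a fraction of the
    improvement that was still possible at pretest time.
    """
    if pre >= max_score:
        raise ValueError("pretest at ceiling; gain is undefined")
    return (post - pre) / (max_score - pre)

# A student moving from 40% to 70% realizes half of the available headroom.
g = normalized_gain(40.0, 70.0)  # 0.5
```

Because the gain is normalized by the room left for improvement, it allows comparisons between students (or classes) that start at different pretest levels.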
D. Findings
Discussing research findings in conceptual understanding can be a controversial issue since, as discussed above,
there are three theoretical perspectives describing the
nature of concepts in memory and conceptual change.
Misconceptions and ways of overcoming them are central
themes according to the misconceptions theoretical view,
whereas the "knowledge in pieces" or "resources" view does not agree on the robustness of misconceptions or on how one might go about overcoming them; proponents of this view would argue that, as typically defined, students do not possess misconceptions, but rather compile knowledge pieces on the spot to reason about phenomena, and thus the misconceptions that emerge for us to observe are highly context dependent. There are also distinct differences
1. Misconceptions
• Students possess misconceptions that are deeply rooted and difficult to dislodge [34,35]. An abundance of misconceptions has been identified across a wide range of topics in undergraduate physics (for a good review prior to 2000, see Ref. [10]). Often, misconceptions seemingly disappear and are replaced with scientific concepts following instruction, only to reappear months later.
• Several intervention strategies, largely based on the conceptual change framework of Strike and Posner [38], have proven effective at helping students overcome misconceptions. Some strategies, such as one that has been successfully used by the University of Washington Physics Education Research group for many years, are based on a cyclic process that begins by identifying misconceptions in a physics topic, then designing interventions based on previous research, instructor experiences, or best guesses, then piloting the intervention and evaluating its success with posttests that measure transfer to related situations, then refining the intervention over further cycles until evidence is obtained that the intervention works. It should be pointed out that even the most successful groups at this type of work (e.g., McDermott, Heron, and Shaffer at the University of Washington) will freely admit that devising effective instructional strategies to combat misconceptions is often a slow, painstaking task requiring multiple tries, not unlike engineering a solution to a complex problem. Other approaches, such as Clement's bridging analogies [26,58,59], start with anchoring intuitions, which are strong and correct understandings that students possess, and devise interventions that attempt to bridge from students' correct intuitions to related contexts in which students display misconceptions. Yet another very effective approach [29] uses interactive lecture demonstrations to help students overcome prevalent misconceptions involving Newton's third law (i.e., the belief that heavy or fast-moving objects exert larger forces on light or stationary objects when the two interact via collisions) by displaying in real time the forces exerted by mutually interacting carts during a collision under different conditions (e.g., a heavy cart colliding with a light cart, or a moving cart colliding with a stationary cart). After experiencing the interactive lecture demonstration intervention, students retain appropriate understanding of Newton's third law months afterwards.
• Despite the previous bullet point, designing an effective intervention to help students overcome a particular misconception can be elusive [43,60].
2. Knowledge in pieces or resources
• Students possess knowledge pieces, also referred to as phenomenological primitives (or p-prims) or resources, that are marshaled to reason about physics contexts. These pieces of knowledge can be recalled singly or in groups and compiled in different ways in real time in response to different contexts. It is common to observe that contexts that are considered equivalent, perhaps even nearly identical, by experts are viewed as different by students, and different knowledge pieces are brought to bear to reason about them.
• Students' knowledge is more dynamic in the pieces or resources view than in the misconceptions view.
• As expertise develops with time and experience, there is refinement of the knowledge in memory whereby knowledge pieces that repeatedly prove effective to recall and apply to particular contexts are compiled into scientific concepts. That is, repeated rehearsal of knowledge pieces recalled and compiled to deal with similar situations leads to the formation of stable scientific concepts in experts.
3. Ontological categories
• Misconceptions stem from students categorizing scientific ideas into inappropriate categories [51,53]. For example, processes are placed in the "things" category (e.g., electric current is thought of as fuel that is "used up" in light bulbs).
• Misconceptions are relatively easy to fix when they involve modification within the same category, but are difficult to fix when they involve modifications across categories [23,61].
• Instructional strategies designed to help students overcome misconceptions by recategorizing into appropriate ontological categories have shown promise (see, e.g., Refs. [28,62-69]).
2. Limitations
• Designing definitive experiments that falsify one of the theoretical views remains elusive; hence, the debate among proponents of the three theoretical views continues.
• Although many misconceptions have been cataloged across both introductory and advanced physics topics, it is daunting to ever achieve a complete list.
F. Areas for future study
Research on identifying and documenting misconceptions has been progressing for several decades and has covered an extensive range of physics topics [10], so future research in this area is limited to alternate populations (e.g., upper-division students; see Ref. [70]) and yet-to-be-investigated topics. Opportunities for continued research on the nature of student thinking and reasoning exist, including how students' ideas progress over time. There is a pressing need for studies to help articulate general instructional strategies for guiding students to adopt scientific conceptions, especially when those conflict with students' existing conceptions. Another promising area for future research is to design experiments to test the three competing viewpoints outlined in the theoretical framework, in order to arrive at a more unified view.
III. PROBLEM SOLVING
A. Research questions
The research in problem solving includes the following categories and questions.
1. Expert-novice research
What approaches do students use to solve physics
problems? How are the problem-solving procedures used
by inexperienced problem solvers similar to and different
from those used by experienced solvers? How do experts
and novices judge whether problems would be solved
similarly? Early studies of physics problem solving investigated how beginning students solve physics problems and how their approaches compare to those of experienced solvers, such as professors [77-83]. This area of research also includes categorization studies to infer how physics knowledge is structured in memory [84-86].
2. Worked examples
How do students study worked-out examples? How do
students use solutions from previously solved problems
when solving new problems? How do students use instructor solutions to find mistakes in their problem solutions to
homework and exams? What features of worked-out
problem solutions facilitate student understanding of the
example? This body of research explores how students use
worked-out problem solutions or previously solved problems to solve new unfamiliar problems [87,88]. It also
includes how students use instructor solutions to self-diagnose errors in their own problem solutions [89,90],
and how to design effective examples [91].
3. Representations
What representations do students construct during
problem solving? How are representations used by students? What is the relationship between facility with
representations and problem-solving performance? What
instructional strategies promote students' use of representations? How do students frame problem-solving tasks? This research explores the use of external representations for describing information during problem solving, such as pictures, physics-specific descriptions (e.g., free-body diagrams, field line diagrams, or energy bar charts), concept maps, graphs, and equations. Some studies focus on what representations are constructed during problem solving and the manner in which they are used [92-96], whereas other studies explore the facility with which students or experts can translate across multiple representations [97-99]. How
students frame the problem-solving task impacts problem
4. Mathematics in physics
How are the mathematical skills used in physics courses
different from the mathematical skills taught in math
courses? How do students interpret and use symbols
during quantitative problem solving? This area of research
explores how quantitative tools from mathematics courses
are applied during physics problem solving. Some examples include the use of symbols in equations to represent
physical quantities [103-106], vector addition [107], arithmetic, algebra, geometry, calculus (e.g., integration) [108],
and proportional reasoning [109].
5. Evaluating the effectiveness of instructional
strategies for teaching problem solving
How does instructional strategy X [e.g., cooperative group problem solving] affect students' problem-solving skills? To what extent do conceptual approaches to problem solving influence students' conceptual understanding of physics? How do students interact with online computer tutor systems? What features of Web-based homework systems successfully enhance students' problem-solving skills? Several instructional strategies have been developed and tested, including the use of alternate types of problems [110-114], adopting an explicit problem-solving framework (i.e., a consistent sequence of problem-solving steps) [115], conceptual approaches [116-118], cooperative group problem solving [119], and computer homework or tutor systems to help students become better problem solvers [71,120,121].
B. Theoretical frameworks
This section identifies a few prominent theories about
learning and problem solving from cognitive science and
educational psychology. Although these frameworks are
viewed as useful and relevant for PER, problem-solving
researchers in PER often do not clearly define or draw upon
a theoretical basis for their research studies. The frameworks reviewed here are intended to provide a starting point
for discussion. The following frameworks are included:
information-processing models, problem solving by analogy, resources model, and situated cognition.
1. Information-processing models
According to one theory of human problem solving
[122-124], problem solving is an iterative process of representation and search for a solution. This theory defines a mental state called the "problem space," consisting of a person's available knowledge and their internal representation or understanding of the task environment, including the initial state (given information), the goal state or target, and appropriate operations that can be employed. After
2. Participants
Studies of physics problem solving frequently involve
undergraduate students in large-scale introductory courses
(algebra based or calculus based) at a single institution over
one or two semesters, sometimes longer. Studies conducted
with high school students or disciplinary majors are less
common. The experts in expert-novice studies are often
physics faculty or graduate teaching assistants (TAs) who
are experienced with teaching introductory courses.
3. Data sources and analysis
One common data source for problem-solving research
is students' written solutions to physics problems that have
been designed by the researcher(s) and/or adapted from
existing problem sources. In most cases the problems are
free response, but occasionally they are in a multiple-choice
format. These data are typically analyzed by scoring the
solutions according to particular criteria, as determined by
comparing the written solutions to features of an ideal
instructor solution. Sometimes rubrics are used to make
scoring criteria objective and improve agreement among
multiple scorers. Students' written solutions are used in
curricular intervention studies to compare performance in
reformed courses to traditional courses on a set of common
assessment items.
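The agreement among multiple scorers mentioned above is often quantified with a chance-corrected statistic such as Cohen's kappa. The following is a minimal sketch; the function and the category labels in the usage note are illustrative, not taken from the studies reviewed here.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance.

    rater_a, rater_b: equal-length sequences of category labels assigned
    to the same set of student solutions.
    """
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of solutions")
    n = len(rater_a)
    # Fraction of solutions where the two raters assigned the same label.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters pick the same label at random,
    # given each rater's observed label frequencies.
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)
```

For example, if two scorers label four solutions as "correct"/"incorrect" and disagree on one, the raw agreement is 0.75 but the kappa is lower, because much of that agreement could occur by chance.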
Another widespread data source, especially for cognitive
studies, is think-aloud problem-solving interviews. To
collect such data participants are asked to solve physics
problems and verbalize their thoughts during the process
while being video- and/or audiotaped. Transcripts from
these interviews are analyzed using standard qualitative
methods such as case studies or grounded theory [57] to
elucidate common themes or problem-solving approaches
from the statements.
Studies comparing expert and novice problem solvers
have traditionally used categorization tasks that require
participants to group problems based on similarity of
solution. Early studies used card-sorting tasks where
subjects physically placed problem statement cards into
piles and the resulting category was assigned a name
[84,138]. More recent studies have used alternate categorization tasks, such as selecting which of two problems would be solved most like a model problem [85], multiple-choice categorization formats [117], or ranking the similarity of two problems on a numerical scale [139].
Some categorization tasks are analyzed using qualitative
methods (to elicit commonalities across card groupings),
whereas others compare quantitative ranking or the frequency of similarity judgment responses for different
groups of subjects. New approaches are emerging for
interpreting card-sorting categorization data [140].
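As a concrete illustration of comparing two card sorts quantitatively, one simple pairwise-agreement measure (the Rand index, shown here as a generic sketch, not the specific technique of Ref. [140]) counts how often two sorters treat a pair of problems the same way:

```python
from itertools import combinations

def sort_agreement(sort_a, sort_b):
    """Pairwise agreement (Rand index) between two card sorts.

    Each sort maps a problem id to a pile label. Two sorts agree on a
    pair of problems if both place the pair in one pile, or both place
    the pair in different piles.
    """
    problems = sorted(sort_a)
    agree = total = 0
    for p, q in combinations(problems, 2):
        total += 1
        same_a = sort_a[p] == sort_a[q]
        same_b = sort_b[p] == sort_b[q]
        agree += same_a == same_b  # True counts as 1
    return agree / total
```

Because it compares pile co-membership rather than pile names, the measure is insensitive to how each sorter happened to label the piles.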
Eye-tracking technology is a rare data source, but in
some cases it is used to study gaze patterns or fixation time
upon particular features of problems or problem solutions
presented on a screen [141143]. These problem features
2. Worked examples
A common practice in problem-solving instruction is to
provide students with worked-out example problems that
demonstrate solution steps. This section briefly reviews
research on how students study example problem solutions
in textbooks, how they refer to previous examples during
problem solving, and how they use instructor solutions to
detect and correct errors in their own solutions. This
research also addresses how to design example solutions
to optimize their usefulness.
• Using examples. Research conducted by Chi et al. [87] and Ferguson-Hessler and de Jong [153] on how students study worked-out example solutions in textbooks concluded that "good" and "poor" students used worked examples differently when solving a new problem. In Chi et al. [87], the labels "good" and "poor" were determined by a student's success at solving problems after studying the worked examples (12 isomorphic or similar problems and 7 unrelated problems from a textbook). Good students (who scored an average of 82% on problems) referred to a specific line or lines in the example to check procedural aspects of their solution, whereas poor students (who scored an average of only 46% on problems) reread the example from the beginning to search for declarative information and a solution procedure they could copy [87]. Ferguson-Hessler and de Jong [153] confirmed these findings and described the actions of good students as "deep processing" of the examples, whereas poor students engaged in "superficial processing."
A study by Smith, Mestre, and Ross [143] used eye
tracking to investigate what aspects of a problem solution
students look at while studying worked-out examples. They
found that although students spent a large fraction of time
reading conceptual, textual information in the solution,
their ability to recall this information later was poor. The
students' eye-gaze patterns also indicated they frequently
jumped between the text and mathematics in an attempt to
integrate these two sources of information.
• Self-diagnosis. When students receive feedback on their performance on homework and exams, instructors generally expect students to reflect on and learn
from their mistakes. However, research indicates that
such self-diagnosis of errors is difficult for students. In
a study by Cohen et al. [89], only one-third to one-half
of students were able to recognize when they used an
inappropriate physics principle for a problem. These
results were affected by the support provided to the
students, indicating that having access to a correct
solution and self-diagnosis rubric helped students
explain the nature of their mistakes better than an
instructor solution alone or only access to the textbook
and class notes [90]. A classroom intervention by
Henderson and Harper [154] found that requiring
D. Findings
1. Contexts
1. Lecture-based methods
3. Laboratory methods
These approaches seek to revise traditional labs, which are often described as confirmatory or "cookbook" labs in which students follow step-by-step instructions to verify a
result. General approaches to making labs more interactive
include engaging students in class discussions before and
at the end of class; revising materials such that students
must plan and make decisions; and probing for student
Physics, the lecture and laboratory sessions are combined into a single course session of 30-45 students meeting for two hours, 2-4 times per week [242], with computer-based activities and collaborative group work. Early implementations of Studio Physics showed low gains on the Force Concept Inventory exam (similar to scores from traditional courses), but the introduction of research-based methods such as interactive lecture demonstrations and cooperative group problem solving significantly improved learning gains [242]. Hoellwarth, Moelter, and Knight [243] found that students in a studio section had significantly higher normalized gains on the FCI and the FMCE than students in a traditional classroom; however, students' scores on quantitative final exam problems were the same or slightly worse in the studio sections.
In another instantiation of studio, called New Studio
Physics, the lecture remains separate but the recitation and
laboratory sessions are combined into a 2-hour session, two
times per week [245]. Implementation of New Studio
Physics found gains on the Force Concept Inventory exam
similar to those obtained by other interactive engagement
instructional methods.
• SCALE-UP. The acronym SCALE-UP stands for Student-Centered Active Learning Environment for Undergraduate Programs, which was first developed and studied at North Carolina State University in physics classes and has been adopted by several other programs and institutions [247,332]. SCALE-UP uses a very carefully designed classroom in which students (typically 50-100) work at round tables in teams (typically 3 teams of 3 students at each table) with networked laptops and access to whiteboards. There are multiple display screens around the room. Students engage in hands-on activities including viewing interactive computer simulations, responding to questions or problems, and conducting hypothesis-driven laboratory experiments. Research on SCALE-UP cites improved problem-solving scores on course exams, conceptual understanding gains as measured by pre-post concept inventories, slightly more positive attitudes, and reduced attrition (see "How do you know it works?" in Ref. [333]; see also Refs. [247,334]).
• TEAL. One extension of Studio-SCALE-UP is the
Technology-Enabled Active Learning (TEAL)
project at the Massachusetts Institute of Technology
[248,250]. TEAL uses the classroom design of
SCALE-UP, which combines lecture, recitation, and
laboratories into a single media-enhanced classroom.
Like SCALE-UP, it incorporates Web-based homework assignments, conceptual questions with clickers,
and two- and three-dimensional visualization tools for
learning electromagnetism concepts [250]. Studies
of TEAL indicate higher learning gains on pre- and
designed projects could encourage students' use of qualitative and analytic reasoning skills. Another instantiation of
computer programming in introductory physics is the use
of VPython in the Matter & Interactions curriculum [256].
In addition to teaching students some basic programming
and debugging skills, the computational tools also allow
students to visualize abstract phenomena and tackle
problems that cannot be solved analytically. For a more
comprehensive review of research on integrating computational physics into the undergraduate curriculum, see
Ref. [351].
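To illustrate the computational approach, the core of a Matter & Interactions-style program is an iterative momentum update. The sketch below uses plain Python and a mass on a spring (the parameters are our own illustrative choices) in place of VPython's vector and 3D-visualization classes.

```python
# Iterative momentum-update loop typical of Matter & Interactions-style
# VPython programs, shown here in plain Python without visualization.
# Illustrative system: a mass on a spring, which students can check
# against the analytic solution; the same loop handles forces with no
# closed-form solution.

m = 0.5          # mass (kg)
k = 20.0         # spring constant (N/m)
x, p = 0.1, 0.0  # initial stretch (m) and momentum (kg m/s)
dt = 0.001       # time step (s)

t = 0.0
while t < 1.0:
    F = -k * x            # net force at this instant
    p = p + F * dt        # momentum principle: dp = F dt
    x = x + (p / m) * dt  # position update from the new momentum
    t = t + dt

# Total mechanical energy; for this update scheme it stays near the
# initial value of 0.5 * k * x0**2 = 0.1 J.
energy = p**2 / (2 * m) + 0.5 * k * x**2
```

Updating the position with the already-updated momentum (the semi-implicit, or Euler-Cromer, scheme) keeps the oscillation stable over many periods, which is one reason this loop structure is favored in introductory computational work.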
• Homework. There are few studies related to effective
procedures for the assignment, collection, and grading of
homework in large enrollment courses. One study found
that allowing students some choice in which problems to
solve and making a fraction of problem solutions available
before homework is due provided more timely feedback to
students, and resulted in an increase in self-direction [352].
• Preservice and in-service teachers. As mentioned
previously, the Physics by Inquiry laboratory materials
are appropriate for educating preservice and in-service
K-12 teachers about physical science [238,239]. A
different curriculum targeted toward elementary
teachers is Physics and Everyday Thinking (PET)
[353]. PET has shown a shift toward expertlike
attitudes as measured by the Colorado Learning
Attitudes about Science Survey [354].
• High school curricula. Although not reviewed here, it is worth mentioning that there have been several texts and workbooks developed for use in middle school physical science and high school physics, which are sometimes adapted for undergraduate courses with nonscience majors. Examples include Modeling Instruction [258,260], Minds on Physics [259], It's About Time: Active Physics [355], Tools for Scientific Thinking [239], and a curriculum for 7th-8th graders called InterActions in Physical Science [356].
• Upper-division curricula. One curriculum that reforms the junior-level and senior-level courses for physics majors is the Paradigms in Physics project at Oregon State University [268,357]. It includes revisions to the structure or order of topics, the content of courses, and instructional methods, in an effort to better reflect physicists' views and experiences in the field. Examples include adding laboratory experiments as a component in several courses, making a stronger link between mathematics and physics, and introducing computational examples and problems [357]. Evaluation of the program indicates improved retention of physics majors and improved physics problem-solving abilities [268]. The development and evaluation of UW Tutorials curricula for upper-division mechanics courses also shows promising effects on learning advanced topics [358]. The University of Colorado-Boulder has transformed their
2. Limitations
This review identified some instructional strategies and
curricula that have not been extensively tested for their
effectiveness in physics, such as the Just-in-Time teaching
method and several laboratory-based curricula. In addition,
although some textbooks are PER based, many of the most
commonly used texts for undergraduate courses are not,
and this can create a mismatch between the goals of
reformed instruction and the materials actually used in
the course; to date, a large percentage of college physics
courses are inconsistent with evidence-based reforms
[368,369].
F. Areas for future study
• Upper-division undergraduate courses and graduate
education. An area of future study includes expanding
evidence-based instructional strategies and materials
that have been developed for introductory undergraduate physics courses to other populations, such
as upper-division (junior- and senior-level) courses
for physics majors and courses for graduate students.
Although there have been some studies of curriculum and instruction in upper-division courses
[268,362,364,370], this area would benefit from additional research.
• High school-university cross-curricular studies. Although several of the physics curricula are identified as appropriate for a particular student population,
it is possible that some curricula for high school might
be appropriate for university nonscience majors (see
Ref. [260]) and that university curricula might be
appropriate for advanced high school courses. The
application of instructional strategies and materials to
alternate groups of students is a possible area for
future research.
• Courses for biology and premedicine students. There
is some current interest in reforming the introductory
course for biology and premedicine students [371].
Additional discussion is needed to determine how
current courses are (or are not) meeting the needs of
this student population, and research is needed to
develop and test reformed instruction and curricula. At
least one research group in Germany has developed
and tested the effectiveness of a reformed physics
laboratory curriculum for medical students [372].
They found that including medical applications
(e.g., connecting electricity and neural functions)
improved students' attitudes towards physics and
improved their ability to relate concepts of physics
and medicine as measured by the construction of
concept maps.
2. Participants
Studies of physics assessments frequently involve
undergraduate students in large-scale introductory courses
(algebra based or calculus based) at a single institution over
one or two semesters, sometimes longer. Studies conducted
with high school students or disciplinary majors are less
common. Very few meta-analyses have been conducted to
compare concept inventory scores across multiple institutions. One such analysis is Hake [209], who examined
Force Concept Inventory scores for 62 mechanics courses,
comparing interactive-engagement instructional strategies
to traditional instructional methods.
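Hake's comparison is well known for using the class-averaged normalized gain, g = (post - pre)/(100 - pre), i.e., the fraction of the possible pre-to-post improvement a class actually achieved. A minimal sketch of that computation (the function name and the sample class means are illustrative, not values from Ref. [209]):

```python
def normalized_gain(pre_mean, post_mean):
    """Class-averaged normalized gain <g>: the fraction of the possible
    pre-to-post improvement actually achieved. Inputs are class-mean
    percentage scores (0-100) on a concept inventory such as the FCI."""
    if pre_mean >= 100:
        raise ValueError("pre-test mean must be below the maximum score")
    return (post_mean - pre_mean) / (100.0 - pre_mean)

# Illustrative (hypothetical) class means, not data from the study:
traditional = normalized_gain(pre_mean=45.0, post_mean=58.0)
interactive = normalized_gain(pre_mean=45.0, post_mean=75.0)
```

Because the gain is normalized by how much room for improvement a class had, courses with very different pre-test scores can be compared on a common scale.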
3. Data sources and analysis
The early development stages of a concept inventory
typically include interviewing students or reviewing
existing research literature to identify misconceptions
and/or propose assessment items, and pilot testing these
items in both a free-response format and a multiple-choice
format. After an assessment has been drafted, the statistical
analysis of scores (often to measure the validity and
reliability of scores on items) typically involves numerical
data and quantitative data analysis. Studies comparing
performance on multiple measures or across student populations typically utilize statistical measures, such as
correlation coefficients.
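The validity and reliability analyses mentioned above typically draw on classical test theory: item difficulty (fraction correct), corrected item-total discrimination, and an internal-consistency coefficient such as KR-20. A sketch of these computations for a dichotomously scored inventory (illustrative code, not taken from any cited study; conventions such as the variance denominator vary across texts):

```python
import numpy as np

def item_statistics(scores):
    """Classical test theory statistics for a dichotomous score matrix.
    scores: 2-D array, rows = students, columns = items, entries 0 or 1.
    Returns (difficulty, discrimination, kr20)."""
    scores = np.asarray(scores, dtype=float)
    n_students, n_items = scores.shape
    # Item difficulty: fraction of students answering each item correctly.
    difficulty = scores.mean(axis=0)
    totals = scores.sum(axis=1)
    # Corrected item-total (point-biserial) discrimination: correlation of
    # each item with the total score of the remaining items.
    discrimination = np.array([
        np.corrcoef(scores[:, j], totals - scores[:, j])[0, 1]
        for j in range(n_items)
    ])
    # KR-20 internal-consistency reliability.
    item_var = (difficulty * (1.0 - difficulty)).sum()
    total_var = totals.var(ddof=1)
    kr20 = (n_items / (n_items - 1)) * (1.0 - item_var / total_var)
    return difficulty, discrimination, kr20
```

Items with near-zero discrimination or extreme difficulty are the ones typically revised or dropped during the pilot-testing stages described above.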
D. Findings
1. Development and validation of concept inventories
Concept inventories used in physics. As stated previously, more than 30 physics concept inventories or attitude
surveys have been written [377]. Many are still under
development and undergoing validation procedures, and
about half of them have been published. The inventories
with accessible references are listed below, in chronological
order of their publication date.
Published concept inventories and validation studies:
MDT: Mechanics Diagnostic Test [376]
MBT: Mechanics Baseline Test [14]
FCI: Force Concept Inventory [15]. In the initial article published about the FCI, the authors identified six conceptual dimensions included on the test (kinematics, Newton's three laws, the superposition principle, and kinds of force). These dimensions came into question when factor analysis could not statistically confirm their existence.
Learning progressions. A few researchers have investigated the progression of learning throughout an introductory electricity and magnetism course, looking at peaks of high scores and subsequent decays in performance on conceptual questions between the administrations of pre- and post-tests [401,402]. They observed a rapid increase in performance after homework assignments (not after lectures or laboratories) that decreased a few days later. The researchers also observed interference effects, where learning about electric potential impeded students' performance on questions related to the vector nature of electric fields, but performance recovered later when students learned about magnetic fields.
E. Strengths and limitations of assessment research
1. Strengths
The topic of assessment using multiple-choice inventories lends itself to large-scale, quantitative studies.
Some of the research studies or meta-analyses cited in
this section include test scores from hundreds or even thousands of students, which provides confidence in the interpretation of results.
The widespread availability of concept inventories in
physics has facilitated their use in classrooms both
nationally and internationally. Instructors have several
tools they can use to evaluate the effectiveness of their
instruction, and can compare published results from
other institutions to the scores of their students.
From the perspective of making physics instruction
more effective, concept inventories historically have
served a pivotal role in catalyzing dialogs among
physics professors who were initially very skeptical
that students (especially the ones they taught) could
hold erroneous conceptual notions about fundamental
physics ideas following instruction.
2. Limitations
Many studies that report data on inventories do not give sufficient information about when the tests were administered or under what incentive conditions, both of which can have a significant impact on results (see Ref. [382]).
It is typical for studies to report average scores for a
particular class or group of students, but many do not
indicate the distribution of scores (e.g., histograms) or
separate the results for underrepresented minorities.
Some research on conceptual tests, especially the Force Concept Inventory, has produced contradictory results. Although the test authors claim the questions and choices were developed based on students' open-ended responses to questions, subsequent studies have found that changing the format of test questions or the
1. Contexts
Cognitive studies in physics education are difficult to
conduct on a large scale, so they typically take place in a
small controlled environment outside of the classroom.
2. Participants
Participants in cognitive research studies are typically
students who know some physics, often paid volunteers
who have taken one or two introductory courses in physics.
The experts in expert-novice studies are often physics
faculty or graduate teaching assistants who are experienced
with teaching introductory courses.
3. Data sources and analysis
Research on cognitive processes often requires descriptive, qualitative sources of data, such as interviews or self-explanations, to infer students' reasoning processes as they
engage in a task. Studies on attention might employ
sophisticated data collection tools such as eye-tracking
devices to record gaze patterns. Studies of problem
categorization can utilize a variety of task formats, as
described below.
Verbal reports from clinical interviews. One common
methodology for studying cognitive processes in
physics education research, and many other disciplines as well, is to conduct interviews of subjects
engaged in cognitive tasks, the hope being that verbal
utterances reflect, at least in part, what the mind is
doing [466-468]. There are differing views on what
can be ascertained from subjects' verbal reports. One
view is that subjects should simply be instructed to
"think aloud" as they perform a task, since it is
believed that doing so does not change the sequence
of thoughts that would occur if the subject were to
perform the task in silence [453]. Doing more, such as
asking subjects why they make particular choices in
performing a task [469] or asking subjects to self-explain while reading and attempting to understand a
solution to a problem [470], may change the accuracy
of observed performance in studies of expertise [467].
Others argue that cognitive clinical interviews are
derivative of naturally occurring forms of interaction
and are thus ecologically valid [466], or that the
quality of the information obtained is partly dependent
on students' perception of the nature of the discourse
interaction [121]. In PER, interview techniques are
often used that attempt to scaffold or cue students to
previously learned knowledge (see Ref. [450] for
examples and Refs. [5,471]); these interview techniques are useful for learning about ways of structuring effective instructional strategies, as in design
research [472,473].
Self-explanations. Asking subjects to explain to themselves out loud the important features of a problem
A. Research questions
4. Teaching Assistants
B. Theoretical frameworks
Research studies of attitudes and beliefs about teaching
and learning do not always explicitly identify theoretical
frameworks. Studies might simply cite previous research
findings from education research or psychology as a basis
for their work. Some of the PER research on faculty beliefs
takes this approach (see, for example, Ref. [511]). One
exception is research on teaching assistants [519,520],
which identifies framing theory from anthropology and
linguistics as a basis for understanding how TAs make
sense of their teaching experiences. The following are examples of frameworks developed and published by researchers in physics education.
1. Student attitudes and beliefs
Hammer [501,522] developed a framework for characterizing students' approaches to learning physics, which
had its basis in education research on learning orientations in addition to interviews he conducted with physics
students. The three categories of his framework are student
beliefs about the structure of physics (made up of pieces
versus coherent), beliefs about physics content (formulas
versus concepts), and beliefs about learning physics
(receiving knowledge versus actively making sense of
new information). Hammer's framework was later used
to design multiple-choice attitude surveys.
2. Instructional practices and conceptions
Dancy and Henderson [523] introduce a framework for
describing instructional practices and conceptions. The
framework was developed from research literature, discussions within the PER community, and interviews with
physics faculty. The framework's ten categories of conceptions include learning view (transmissionist versus
constructivist); beliefs about expertise; beliefs about knowledge; beliefs about the nature of physics; conceptions of
their role as an instructor; the role of the school; views of
students' capacity to learn; desired outcomes (goals) for
students; diversity; and scientific literacy. The ten categories of practice identify behaviors typical of traditional
instruction compared to alternative instruction and include
interactivity (passive lecturing versus active student conversation); instructional decisions (made by teacher alone
versus with students); knowledge sources (students receive
1. Contexts
Studies involving students, such as identifying students'
attitudes and beliefs about learning physics and the development of attitude surveys, have primarily taken place in
conjunction with introductory physics courses. In contrast,
studies involving instructors (faculty or teaching assistants)
often employ interviews that take place in a controlled
experimental setting outside the classroom or surveys that
are administered online.
2. Participants
Studies of students' attitudes and beliefs about learning
physics have typically involved students enrolled in an
introductory course. Studies of instructors' beliefs about
learning and teaching physics often involve faculty (from a
variety of institutional types) who are currently teaching a
course or who have substantial experience teaching physics. A very small number of studies involve graduate
teaching assistants in their first or second year of graduate
school.
3. Data sources and analysis
Clinical interviews. Research to identify the attitudes
and beliefs of students and instructors typically begins
with in-depth interviews that are audio and/or video
recorded. The transcribed statements are coded by
themes that are determined either by the researchers or
from existing theoretical frameworks. These methods
were used by Hammer [47,501,502] to describe
students' beliefs about learning physics, and by
several researchers investigating instructors' beliefs
[201,511-514]. Sometimes these instructor interviews
include specific artifacts such as sample student or
instructor solutions that provide a basis for discussion.
Surveys. Web-based surveys have been used in research on faculty knowledge and use of research-based instructional strategies [368,369]. The results
are analyzed using descriptive statistical measures,
such as reporting the average fraction of survey
respondents who reported using a particular research-based instructional strategy.
Classroom observations. Observations of instruction
have been used in studies of teaching assistant
4. Teaching assistants
Graduate and undergraduate students are frequently
employed to teach laboratory and/or recitation sections
of introductory physics courses, especially at large institutions. The preparation and instructional support these
TAs receive varies widely across institutions, and to date
limited research has been done on teaching assistants'
beliefs and practices.
Teaching assistants' pedagogical beliefs and their
influence on instructor-student interactions. Goertzen,
Scherr, and Elby [519,520] and Spike and Finkelstein
[535] used interviews and videotaped observations
of teaching assistants to analyze their classroom
interactions while using the Tutorials curriculum at the
University of Colorado Boulder (CU) and the
University of Maryland (UM). Goertzen et al. found
that a lack of buy-in from the TA resulted in
behaviors that conflicted with the tutorial developers'
intentions, such as giving direct answers to students'
questions and increasing the use of equations as a
reasoning tool. Spike and Finkelstein concluded that
the beliefs and behaviors of teaching assistants had a
profound impact on the attitudes of students in the
class, such as their willingness to engage in group
work. Additional studies have investigated the pedagogical beliefs of graduate students and learning
assistants about problem solving [536], studio style
classrooms [537], interactive-engagement courses
[538], or an inquiry-based course for future elementary school teachers [539]. These studies have generally concluded that there is a wide range in teaching assistant behaviors, which results in a variety of different experiences for students in those classes.
The impact of instructional styles on student learning.
Koenig, Endorf, and Braun [521] studied recitation
ACKNOWLEDGMENTS
We thank the National Research Council, and in particular Natalie Nielsen and Susan Singer from the NRC's
Committee on the Status, Contributions and Future
Directions of Discipline-Based Education Research, for
asking us to develop the original PER synthesis from which
this publication emerged, and for the helpful discussions that shaped the structure of this synthesis.
We also thank Charles Henderson for encouraging us to
update the original NRC white paper for publication in
PRST-PER. Additionally, we thank the anonymous
reviewers for their helpful suggestions. We also thank
Paula Heron, Michael Wittmann, and Rachel Scherr for
inviting us to present a summary of our white paper at the
2011 Frontiers and Foundations of Physics Education
Research Conference, and Suzanne Brahmia for helping
to summarize the suggestions made by attendees at this
conference; we found the interactions and suggestions
made by the attendees of this conference extremely helpful.
Finally, we thank Jeanne Brunner, for her help in updating
our review of the literature. This material is based on work
supported by the National Science Foundation under Grant
No. DUE-1347722.
[132] B. H. Ross, Distinguishing types of superficial similarities: Different effects on the access and use of earlier
problems, J. Exp. Psychol. Learn. Mem. Cogn. 15, 456
(1989).
[133] L. M. Reeves and R. W. Weisberg, The role of content and
abstract information in analogical transfer, Psychol. Bull.
115, 381 (1994).
[134] S. Y. Lin and C. Singh, Using isomorphic problems to
learn introductory physics, Phys. Rev. ST Phys. Educ.
Res. 7, 020104 (2011).
[135] D. Hammer, A. Elby, R. Scherr, and E. F. Redish, in
Transfer of Learning from a Modern Multidisciplinary
Perspective, edited by J. Mestre (Information Age
Publishing, Greenwich, CT, 2005), pp. 89-119.
[136] J. Tuminaro, A cognitive framework for analyzing and
describing introductory students' use and understanding of
mathematics in physics, Ph.D. thesis, University of
Maryland, 2004 (unpublished).
[137] J. S. Brown, A. Collins, and P. Duguid, Situated
cognition and the culture of learning, Educ. Res. 18,
32 (1989).
[138] A. Mason and C. Singh, Revisiting categorization, in
Proceedings of the 2009 National Association for
Research in Science Teaching (NARST) Annual Meeting
(Garden Grove, CA, 2009) Symposium S2.2, pp. 321.
[139] F. Mateycik, D. Jonassen, and N. S. Rebello, Using
Similarity Rating Tasks to Assess Case Reuse in Problem
Solving, in Proceedings of the Physics Education Research
Conference (Ann Arbor, MI, 2009) [AIP Conf. Proc.
1179, 201 (2009)].
[140] S. F. Wolf, D. P. Dougherty, and G. Kortemeyer, Empirical approach to interpreting card-sorting data, Phys. Rev.
ST Phys. Educ. Res. 8, 010124 (2012).
[141] A. Carmichael, A. Larson, E. Gire, L. Loschky, and N. S.
Rebello, How does visual attention differ between experts
and novices on physics problems?, AIP Conf. Proc. 1289,
93 (2010).
[142] D. Rosengrant, A. Van Heuvelen, and E. Etkina, Do
students use and understand free-body diagrams?, Phys.
Rev. ST Phys. Educ. Res. 5, 010108 (2009).
[143] A. D. Smith, J. P. Mestre, and B. H. Ross, Eye gaze
patterns as students study worked out examples in
mechanics, Phys. Rev. ST Phys. Educ. Res. 6, 020118
(2010).
[144] T. de Jong and M. G. M. Ferguson-Hessler, Cognitive
structures of good and poor novice problem solvers in
physics, J. Educ. Psychol. 78, 279 (1986).
[145] J. L. Docktor, J. P. Mestre, and B. H. Ross, Impact of a
short intervention on novices categorization criteria,
Phys. Rev. ST Phys. Educ. Res. 8, 020102 (2012).
[146] S. F. Wolf, D. P. Dougherty, and G. Kortemeyer, Rigging
the deck: Selecting good problems for expert-novice
card-sorting experiments, Phys. Rev. ST Phys. Educ.
Res. 8, 020116 (2012).
[147] M. Finegold and R. Mass, Differences in the process of
solving physics problems between good problem solvers
and poor problem solvers, Res. Sci. Technol. Educ. 3, 59
(1985).
[148] L. N. Walsh, R. G. Howard, and B. Bowe, Phenomenographic study of students' problem solving approaches in
[223] S. J. Pollock and N. D. Finkelstein, Sustaining educational reforms in introductory physics, Phys. Rev. ST
Phys. Educ. Res. 4, 010110 (2008).
[224] A. Elby, Helping physics students learn how to learn, Am.
J. Phys. 69, S54 (2001).
[225] M. C. Wittmann, R. N. Steinberg, E. F. Redish, and the
University of Maryland Physics Education Research
Group, Activity-Based Tutorials Volume 1: Introductory
Physics, The Physics Suite (John Wiley & Sons,
New York, 2004).
[226] M. C. Wittmann, R. N. Steinberg, E. F. Redish, and the
University of Maryland Physics Education Research
Group, Activity-Based Tutorials Volume 2: Modern
Physics (John Wiley & Sons, New York, 2004).
[227] D. R. Sokoloff, R. K. Thornton, and P. W. Laws,
RealTime Physics Active Learning Laboratories Module
1: Mechanics, 2nd ed. (John Wiley & Sons, Hoboken, NJ,
2004).
[228] D. R. Sokoloff, R. K. Thornton, and P. W. Laws,
RealTime Physics Active Learning Laboratories Module
2: Heat and Thermodynamics, 2nd ed. (John Wiley &
Sons, Hoboken, NJ, 2004).
[229] D. R. Sokoloff, R. K. Thornton, and P. W. Laws,
RealTime Physics Active Learning Laboratories Module
3: Electric Circuits, 2nd ed. (John Wiley & Sons,
Hoboken, NJ, 2004).
[230] D. R. Sokoloff, R. K. Thornton, and P. W. Laws,
RealTime Physics Active Learning Laboratories Module
4: Light and Optics, 2nd ed. (John Wiley & Sons,
Hoboken, NJ, 2004).
[231] R. J. Beichner, The impact of video motion analysis on
kinematics graph interpretation skills, Am. J. Phys. 64,
1272 (1996).
[232] E. Etkina and A. Van Heuvelen, Investigative Science
Learning Environment: A science process approach to
learning physics, in Reviews in PER Volume 1: Research-Based Reform of University Physics, edited by E. F.
Redish and P. J. Cooney (American Association of
Physics Teachers, College Park, MD, 2007).
[233] E. Etkina, A. Karelina, M. Ruibal-Villasenor, D. Rosengrant,
R. Jordan, and C. E. Hmelo-Silver, Design and reflection
help students develop scientific abilities: Learning in
introductory physics laboratories, J. Learn. Sci. 19, 54
(2010).
[234] L. McCullough, The effect of introducing computers
into an introductory problem-solving laboratory,
Ph.D. thesis, University of Minnesota, 2000
(unpublished).
[235] R. Lippmann Kung, Teaching the concepts of measurement: An example of a concept-based laboratory course,
Am. J. Phys. 73, 771 (2005).
[236] B. J. Duch, S. E. Groh, and D. E. Allen, The Power of
Problem-Based Learning (Stylus Publishing, Sterling,
VA, 2001).
[237] M. A. Pease and D. Kuhn, Experimental analysis of the
effective components of problem-based learning, Sci.
Educ. 95, 57 (2011).
[238] L. C. McDermott and The Physics Education Group at the
University of Washington, Physics by Inquiry (John
Wiley & Sons, New York, 1996), Vol. I.
[426] S. Bonham, Reliability, compliance, and security in webbased course assessments, Phys. Rev. ST Phys. Educ.
Res. 4, 010106 (2008).
[427] J. Stewart, H. Griffin, and G. Stewart, Context sensitivity
in the force concept inventory, Phys. Rev. ST Phys. Educ.
Res. 3, 010102 (2007).
[428] B. W. Frank and R. E. Scherr, Interactional processes for
stabilizing conceptual coherences in physics, Phys. Rev.
ST Phys. Educ. Res. 8, 020101 (2012).
[429] A. B. Champagne and L. E. Klopfer, A causal model of
students' achievement in a college physics course, J. Res.
Sci. Teach. 19, 299 (1982).
[430] A. B. Champagne, L. E. Klopfer, and J. H. Anderson,
Factors influencing the learning of classical mechanics,
Am. J. Phys. 48, 1074 (1980).
[431] J. Docktor and K. Heller, Gender differences in both force
concept inventory and introductory physics performance,
AIP Conf. Proc. 1064, 15 (2008).
[432] Z. Hazari, R. H. Tai, and P. M. Sadler, Gender differences
in introductory physics performance: The influence of
high school physics preparation and affective factors, Sci.
Educ. 91, 847 (2007).
[433] P. M. Sadler and R. H. Tai, Success in introductory
college physics: The role of high school preparation,
Sci. Educ. 85, 111 (2001).
[434] R. H. Tai and P. M. Sadler, Gender differences in introductory undergraduate physics performance: University
physics versus college physics in the USA, Int. J. Sci.
Educ. 23, 1017 (2001).
[435] L. E. Kost-Smith, S. J. Pollock, and N. D. Finkelstein,
Gender disparities in second-semester college physics:
The incremental effects of a smog of bias, Phys. Rev.
ST Phys. Educ. Res. 6, 020112 (2010).
[436] A. Miyake, L. E. Kost-Smith, N. D. Finkelstein, S. J.
Pollock, G. L. Cohen, and T. A. Ito, Reducing the gender
achievement gap in college science: A classroom study of
values affirmation, Science 330, 1234 (2010).
[437] W. Fakcharoenphol, E. Potter, and T. Stelzer, What
students learn when studying physics practice exam
problems, Phys. Rev. ST Phys. Educ. Res. 7, 010107
(2011).
[438] D. J. Palazzo, Y.-L. Lee, R. Warnakulasooriya, and D. E.
Pritchard, Patterns, correlates, and reduction of homework copying, Phys. Rev. ST Phys. Educ. Res. 6, 010104
(2010).
[439] K. Cummings and J. D. Marx, Beta-test data on an
assessment of textbook problem solving ability: An
argument for right/wrong grading?, AIP Conf. Proc.
1289, 113 (2010).
[440] J. Stewart and S. Ballard, Effect of written presentation on
performance in introductory physics, Phys. Rev. ST Phys.
Educ. Res. 6, 020120 (2010).
[441] M. A. McDaniel, H. L. Roediger III, and K. B.
McDermott, Generalizing test-enhanced learning from
the laboratory to the classroom, Psychon. Bull. Rev.
14, 200 (2007).
[442] For a discussion of the relationship between discipline-based education research and other research areas such as
educational psychology and cognitive science, refer to the
introduction of Ref. [2].