Learning how to teach experiments in the school physics laboratory
O Gkioka
Department of Secondary Mathematics and Science Education, Bogazici University,
Istanbul, Turkey
[email protected]
Abstract. The focus of this research is in the broader area of physics teacher education. The aim
is to investigate physics teachers’ efforts to learn how to teach and assess experiments, and in
particular, concepts related to scientific evidence in the school physics laboratory. The study has
looked at the participants as both learners and prospective teachers. It has taken place within the
context of the course “Secondary Science Lab Applications” within a pre-service teacher
education program in a Department of Secondary Science and Mathematics Education. Twenty-
four students participated. The participants: a) revised the main concepts related to scientific
practices (in particular, experimental validity and reliability of measurement), b) developed
lesson plans, teaching and assessment methods, and c) taught school physics experiments.
Interviews were conducted with the participants during their preparation for teaching and after
teaching. They were observed when teaching and all classes were videotaped. A qualitative and
quantitative data analysis identified particular trends among the participants. Students’ difficulties while designing and carrying out experiments and writing lab reports were identified, as well as the difficulties they experienced when teaching experiments.
1. Introduction
Physics is not only theory, concepts, laws and formulas. It is also an experimental science. Laboratory
work is at the heart of physics. In Turkey, although the National Curriculum includes experiments,
physics classes often do not include experimentation. Requirements for high-stakes exams restrict the instruction time teachers have available for lab work.
In addition, in pre-service teacher education, major emphasis is given to conceptual understanding
of physics and concepts across the curriculum. In our department, an initiative was taken to prepare our
student teachers to teach experiments effectively in the school physics laboratory: to address the
knowledge and understandings required to teach scientific skills and methods explicitly. The goal has
been to elicit pre-service teachers’ understandings of scientific evidence and, through targeted instruction, to support them in revising such understandings and learning how to write laboratory reports.
2. Research into secondary students’ difficulties with laboratory work
In physics education, many secondary students experience difficulties in the laboratory. The literature
provides a range of difficulties that secondary pupils encounter in the school science laboratory [1–5].
Many research studies document a lack of fundamental understanding of scientific evidence among secondary students. For example, studies of performance on lab work have questioned the assumption that
students understand the nature of measurement and experimentation [6,7]. This work has led to the
development and refinement of various laboratory teaching materials [8]. After completing a traditional
laboratory course, the majority of students have ideas about measurement that are inconsistent with the
generally accepted scientific model. For example, as Séré and her team found, a large proportion of
students view the ideal outcome of a single measurement as an “exact” or “point-like” value. Only if a
measurement is considered really “bad” would it be reported in terms of an interval [6].
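By contrast, the generally accepted “set” view treats the result of N repeated readings as an interval. As a minimal sketch (standard textbook statistics, not an instrument used in the cited studies), the reported value and its uncertainty are
$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad s = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2}, \qquad u(\bar{x}) = \frac{s}{\sqrt{N}},$$
so the measurement is reported as $\bar{x} \pm u(\bar{x})$; repeating readings improves the result because $u(\bar{x})$ shrinks roughly as $1/\sqrt{N}$, not because any single reading becomes “exact”.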
Boudreaux et al investigated the ability of students to reason on the basis of the control of variables
[9]. The participants were undergraduate physics students who were asked to decide whether or not a given variable influences the behaviour of a system. They found that although most of the students
recognized the need to control variables, many had significant difficulty with the underlying reasoning.
It was argued that teachers should be prepared for how to teach experiments and develop science
practices in secondary school students.
Yet, little is known about pre-service teachers’ understanding of scientific evidence. In physics
departments it is often assumed that the participation of physics students in regular undergraduate
science laboratory courses provides them with the required knowledge and skills to teach in a school
laboratory [10]. This assumption goes hand in hand with the belief that the more rigorous and difficult the modules one takes in an undergraduate program, the better prepared the physics teacher will be [11]. There is a need to develop ‘special’ courses for physics teachers’ preparation, since teachers
need to have deeper knowledge than their students.
The aim of this research study was to investigate physics teachers’ efforts to learn how to teach and
assess experiments, and in particular, concepts related to scientific evidence in the school physics
laboratory. The study looked at the participants as both learners [12,13] and prospective teachers [14,15].
3. Context of the research study and research questions
In Turkey, the secondary science curriculum gives much attention to conceptual knowledge, leaving little space for the teaching of experiments. Although the National Curriculum includes experiments, physics classes often do not include experimentation. Requirements for high-stakes exams restrict the instruction time teachers have available for lab work. In our department, an initiative was taken to prepare our student teachers to teach experiments effectively in the school physics laboratory: to address the knowledge and understandings required to teach scientific skills and methods explicitly. The goal has been to develop a good understanding of what makes ‘good scientific evidence’, firstly in the teachers themselves and secondly in their secondary school students. Two research questions have guided the study:
1) What are the understandings of scientific evidence that pre-service physics teachers demonstrate
when they conduct experiments?
2) What are the difficulties and challenges that they experience when they teach and assess
laboratory work?
The study has taken place within the context of the course “Secondary Science Lab Applications”
within a pre-service teacher education program in a Department of Secondary Science and Mathematics
Education. Twenty-four pre-service physics teachers participated for one academic year. The
participants firstly revised the main concepts related to scientific practices (experimental validity,
planning and design of experiments, identification of variables, controlled experiment, measurement
and uncertainty, identification of errors and sources of errors, reliability of measurement) taught in
physics lab classes. Secondly, they carried out school experiments (i.e., Hooke’s law, simple pendulum
motion, insulation experiment, experiments with simple electrical circuits, refraction and reflection,
friction, motion on an inclined plane and electromagnetic induction). Then, they learned about effective
teaching and assessment methods (criteria and feedback) in the lab, informed by research. Finally, they
taught a 40-minute lesson including an experiment to their peers.
2
4. GIREP-ICPE-EPEC 2017 Conference IOP Publishing
IOP Conf. Series: Journal of Physics: Conf. Series 1286 (2019) 012016 doi:10.1088/1742-6596/1286/1/012016
4. Research methodology
Individual interviews were conducted, based on tasks specifically designed to probe ideas related to scientific evidence. Students’ difficulties while they designed and carried out experiments and wrote lab reports were identified. All experiments and discussions in the lab were videotaped. Mid-term and final exam items were another important source of data for answering the first research question.
In the second part of the semester, the participants had the opportunity to prepare lesson plans and teach one lesson to their peers. Lessons were videotaped, and interviews (during their preparation and after their teaching) were conducted to identify the prospective teachers’ difficulties. Their lesson plans and the tasks they developed for their teaching and assessment were collected and analysed to answer the second research question.
The participants were given samples of actual secondary school students’ work and were asked to give written feedback on it. Research studies attach much importance to teachers’ ability to give constructive feedback so that learners understand what they need to do to improve [16,17]. Teachers should be skillful in assessing lab skills and, more generally, students’ performance in the laboratory [18–21]. In addition, a study by Warren Little et al reported on the benefits for teachers’ learning
and professional development when teachers look at actual student work and think about students’
performance [22]. Also, Crespo found evidence that by examining students’ work, teachers have the
opportunity to learn about students’ thinking and practices [23].
4.1. Data collection and analysis
The reliability of the analysis is based on the triangulation of the methods employed. Qualitative data
analysis [24] was conducted to identify particular trends among the participants. We generated initial
categories from interviews, observation transcripts and exam item responses of each teacher. We
constantly compared new data from the interviews, and from the observations and exam items (mid-
term and final) with the current categories, and refined them. When clarification was needed, we
collected more data by conducting focused conversations with the teachers. The data were analyzed by
comparing the responses for each question both across interviewees and within each interview, in order to identify key categories and features among teachers and possible changes within the same
teacher. In the next phase, analysis results of the individuals were compared to identify common themes.
Triangulation was also applied by comparing and discussing the interpretations by the main researcher
and the two project assistants.
5. Results
5.1. Student understanding of the fair test
Students had difficulties in planning a fair test or a controlled experiment. One of the difficulties was
that they could not identify all relevant variables involved and then decide which variables should be
independent, constant, or dependent in their design. The ability to control variables in order to design a fair test, and the related understandings, were documented by asking the participants to design a controlled experiment and by looking at their reports on the experiments they performed. Most importantly,
participants were presented with tasks and asked to evaluate other experimenters’ conclusions. Such
tasks were used as research instruments. For example, the following task asks students to evaluate the conclusions of previous experimenters who performed an experiment. Note that students were asked to interpret conclusions on the basis of the design of a controlled experiment; they were not explicitly asked to examine whether the experiment was a fair test, even though this is a basic requirement for the evaluation.
Task 1: Fair Test
Ali and Selin wanted to find out the relationship between the resistance and the cross-sectional area of
a constantan wire. They performed a series of measurements for six pieces of constantan wire with different cross-sectional areas, and reported these in the table above.
a) What are the variables involved in this experiment?
b) Based on their data, Ali and Selin concluded that the resistance is inversely proportional to the cross-
section. Do you agree?
c) What is the total percentage error involved in the measurement of the resistance?
d) How can this experiment be improved?
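Although not part of the task as given, the reasoning the task probes can be sketched with the standard resistivity relation and, assuming the resistance is obtained from voltage and current readings, the usual combination of fractional uncertainties:
$$R = \frac{\rho(T)\,L}{A}, \qquad \frac{\Delta R}{R} \approx \frac{\Delta V}{V} + \frac{\Delta I}{I} \quad \text{(for } R = V/I\text{)},$$
so a conclusion that $R \propto 1/A$ is only warranted if the length $L$, the material and the temperature $T$ (through the resistivity $\rho$) were held constant across the six wires.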
The respondents gave various answers along with different explanations. The majority of students
agree with the conclusion (of Ali and Selin), without examining whether the experiment is controlled or
a fair test.
Many students did not mention the idea of the fair test, and they did not consider whether the design is that of a fair test or what variables are involved in the experiment. In fact, they did not mention that no information is given about the length of the wire, the temperature or the material type. Thus,
most of them replied: “I trust the data, because their conclusion is correct; according to the theory,
according to what I know”. Similarly, another student wrote: “I agree with their conclusion, because
their conclusion matches with the theory”; “I agree with this conclusion, because I learned it from
theory”. Those students do not look at the data and the procedure followed to examine whether the
experiment is a controlled one. For improvement, they only recommended repeating the measurements, without explaining how repetition of measurements improves the results.
When students are asked to evaluate a claim that is consistent with what they have been taught
(theory), they agree without examining the design of the test and in fact, the validity of the experiment
(whether it is actually an adequate fair test). Without examining whether the experiment is a controlled
experiment, they report that they trust the data. Thus, although one student mentioned that more variables (such as length) should be included, she did not examine whether it is a fair test or not, because the conclusion is consistent with theory. “I agree with this claim since I know it from the theory. If
someone does not know the theory, he cannot conclude this from data set because it is not a fair test”.
“It is not a fair test, but still I agree with the conclusion, because I know from theory”.
Only a few students reported that they do not agree with the conclusion because there is missing
information about the length of the wire and the temperature. Such information is crucial when one
makes judgements about whether the experiment is a controlled experiment: “We do not have
information about the variable of length. We do not know that it is a fair test. We need information about
length and that length is constant (so that it is a controlled experiment)”. “We cannot conclude this
because the experiment is not a fair test, they did not report on the length of the wires, which is also an
independent variable to determine its resistance …”; “This is not a fair test and it should be a fair test
in order to be able to make conclusions”; “We need to have a fair test. Length also needs to be
considered”; or, better, “This experiment is not valid because information about length of the wire is
missing from our data table. Without it, as a variable, we cannot conclude with any statement like this,
since length could be different, and this would affect the conclusion. I do not agree with the two
students”. On this basis they suggest that the experiment needs to be controlled.
For the twenty-four students, the results are shown in Table 1. These categories are not mutually exclusive.
Table 1: Student understanding of the fair test in the first two weeks of the semester.
Category: number of students
No mention of the concept of a fair test: 22
No mention of all the variables involved (length of wires and temperature): 19
“We know from theory that the resistance is inversely proportional to cross-sectional area”: 23
“This is not a fair test. They should make it a fair test.”: 2
Student understanding of the fair test improved only after targeted instruction and work on further tasks with second-hand data that asked them to evaluate the procedure and the experimental evidence. In later tasks, they considered the controlled experiment and answered that they needed more information about some variables in order to know whether the experiment was a controlled one or not.
5.2. Student difficulties in designing a controlled experiment
Our students demonstrated difficulty in recognizing the variables that contribute to an experimental
result. For example, this was the case when they were asked about the variables involved in the insulation experiment.
Task 2: The insulation experiment
You, as a physics student, are investigating how the temperature of hot water drops in three similar cans, each wrapped with a different insulating material.
1. What variables do you think affect the temperature of hot water in a can wrapped with an insulating
material?
2. How does each variable affect the temperature of hot water while it is cooling down and its rate of
change?
I would like you to design an experiment that would allow you to decide which of the three insulating
materials is best. After you are finished, you need to describe how you plan your experiment, obtain
evidence, analyze and explain your data, as well as how you evaluate your experiment.
Student difficulties in designing the fair test are reflected in their lab reports, too. For example, they did not include all the variables involved and did not control all the variables in the insulation experiment. This is because, when performing experiments in lab classes in the physics department, students are always provided with guidelines and instructions on how to proceed. In the
insulation experiment, they need to include the following variables: amount of water, temperature drop/
time, starting temperature, material of the container, room temperature, number of layers, and thickness
of insulation layer.
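As a sketch of the analysis the task invites, assuming Newton’s law of cooling (which the participants themselves later invoked) describes the wrapped cans:
$$T(t) = T_{\text{room}} + \left(T_0 - T_{\text{room}}\right) e^{-kt},$$
where $T_0$ is the starting temperature and the constant $k$ reflects the effectiveness of the insulation. With the amount of water, starting temperature, container and room temperature held constant, the wrapping that gives the smallest $k$, i.e. the slowest temperature drop over the same time interval, identifies the best insulator.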
Figure 1: The insulation experiment.
5.3. The participants giving feedback to laboratory reports as prospective teachers
The participants were asked to give feedback to secondary student lab reports. The reports were actual
reports taken from secondary schools.
Task 3: You, giving feedback as a teacher
Give your written feedback to a secondary school student (grade 11) who submitted the attached laboratory report.
In giving feedback, the participants would give information about what needs to be improved and how:
• “You need to draw a graph and use it”.
• “You need to collect better data / results so that you draw a better graph and be able to look
at the pattern”; “Theory is missing”, or, “You need to improve your theory”.
• “You need to describe your method. What are the variables involved in your experiment?”
• “Nice description of the procedure”; “The fair test is OK, but more measurements are needed”;
“The data table is good enough”; “Why this number of measurements?”
• “Variables (dependent and independent) are missing”; “Points are plotted properly”.
• “You need to draw a graph”; “What does the graph tell you about the whole pattern?”
• “Analysis is missing. What are the different rates of cooling?”
• “You need to write a statement about the relationship between the two variables”; “You need
to give the graph a title”.
Giving feedback helps pre-service teachers develop a good understanding of the quality of a lab
report and what is included in each section. Also, the process helps them make the transition from
undergraduate student to practicing teacher. They reported that they enjoy grading and giving feedback.
It is a process through which they learn how to improve the quality of their own lab reports.
The assessment criteria were developed over the semester. Through practice and over time, the participants became able to give detailed feedback.
Figure 2. Interaction between giving feedback to student lab reports and improvement of lab reports.
5.4. Our students as prospective teachers: difficulties in preparing and planning to teach
Preparing to teach in the laboratory presents a great challenge. Our students were not confident about teaching laboratory skills, and their plans were weak. The difficulty of preparing a lesson plan with learning goals related to the development of particular experimental skills was clear. Tasks like the following were distributed in class, as homework and in the exam papers. Such tasks were designed to develop teaching and assessment methods closely related to experimental skills.
Task 4: You, as a teacher
Prepare a lesson plan to teach your students how to plan (planning part) and take measurements
(obtaining evidence) in the insulation experiment.
Task 5: You, as a teacher
Write a set of rubrics for the assessment of the design of an experiment and of the prediction of the outcome of the experiment.
Task 6: You, as a physics teacher
You need to write a set of rubrics for assessment of the “analysis and explanation of experimental
results” in an experiment and the lab report. The set of rubrics should help teachers in their teaching and
in the assessment of the “analysis and explanation” part. In addition, secondary students will use the set
of rubrics in peer- and self-assessment of analysis and explanation.
When working on the insulation task (Task 2), the participants talked only about the theory of
insulation, heat and temperature, heat capacity and Newton’s law of cooling. They also sketched the
graph of the cooling down of hot water. Similarly, while they were preparing to teach the Ohm’s law experiment, students wrote that the topic is about Ohm’s law (not about the teaching of experimental skills), and that the lesson objective is: “By the end of the lesson, students will have understood the relation
between voltage and current and the associated Ohm’s law”. Similarly, “Students will be able to identify
and determine the difference between heat and temperature”. And, for the topic of insulation and the insulation experiment: “Students will learn what an insulator is”; “Understand how insulators work”. They
confuse the context (theory) with the experimental/laboratory skills they want to teach and develop in
their secondary students. Instead, we wanted them to talk about the preliminary experiment and the class discussion in which students talk in groups about planning, variables and the design of a controlled experiment. Similarly, in their lesson plans they describe the different parts of an experiment, when what they need to write is what the teacher and the students will do (teaching methods and strategies) in the laboratory or in the classroom.
Only a few, more senior students (no more than 3 of the 24) answered that they would teach the theory of the experiment (Newton’s law of cooling) in the introductory lesson. They also said that they were going to review each part of the experiment and what to include in the lab report. “Use instruments to
make measurements of temperature”; “Carry out a controlled experiment”. Other teaching goals and student learning objectives were: “Students will be able to draw a best fit line / best fit curve”.
Most students had difficulties in preparing a lesson plan to teach students how to plan and design a
fair test or in preparing a lesson to help them improve the analysis of evidence. Through practice, they gradually learned to articulate teaching goals such as:
• “Teach to judge the range of measurements they need to take, when and why these need to be
repeated, and how to deal with anomalous or discrepant results”
• “Teach students to examine evidence for validity and reliability by considering questions of
accuracy, error and discrepancy”
• “Teach students how to draw the best fit line or best fit curve”
• “Teach how to use the graph in the analysis of results”
• “How to design a fair test”; “How to collect and record data”
• “How to analyze data”; “How to make a detailed analysis of results by using the plotted graph
and make the calculation of slopes”.
Students experienced difficulties in writing learning goals and objectives and teaching goals related
to the development of lab skills. This is mainly because in all education and physics education classes,
they have had training and practice in developing lesson plans to teach only concepts of physics (theory). Only at the end of the semester were they able to articulate and write some teaching goals:
• “Students learn how to plan the insulation experiment”
• “How to take measurements while performing the experiment”
• “In the preliminary experiment, they will make decisions about...”
• “Students to be able to plan the insulation experiment, to identify the variables involved, to
carry out the preliminary experiment and how to design a controlled experiment”
They also developed assessment methods; they wrote assessment goals, were introduced to
performance assessment, and developed assessment criteria for feedback and rubrics for grading. When it came to preparing a lesson to address specific weaknesses (as demonstrated in a lab report), however, they did not know how to match the learning goals with laboratory activities. One senior student with
considerable teaching experience in internship schools wrote in a reflection journal: “Planning a lesson
was difficult for me. I needed some guidance to design and plan my lesson. I keep the theory simple, so
that I teach them laboratory skills. I learned about progression in teaching (teaching sequences)”.
One of the important issues in their teaching is that pre-service teachers do not seem to understand
the common components of controlled experiments. There is also an interplay between theory (the context of the experiment) and the development of lab skills, which makes them attach more importance to the underlying theory than to the development of experimental skills when asked to evaluate experimental evidence.
Pre-service teachers have difficulties in connecting what they learn with teaching practice. Our participants were not successful in applying what they learned in the course to prepare their lesson plans and do the actual teaching. They experienced particular difficulties when preparing lesson plans for teaching in the lab, when writing lesson objectives and so on. Only a few of the twenty-four students had prepared adequate lesson plans.
6. Discussion of results and conclusions
This study has provided strong evidence about the difficulties and needs of our participants. The aim
was to document pre-service physics teachers’ difficulties in understanding scientific evidence and then,
difficulties in teaching and assessing lab skills and experiments in the school laboratory. Tasks were
developed to elicit such understandings and challenges. In turn, the findings guided the
development of tasks and teaching materials to explicitly address such deficiencies. In addition, the
study confirmed research results from previous studies, but also elicited new findings.
Firstly, the results show that teachers have a limited understanding of experimental design and
concepts of validity and reliability. Similar to the findings of two earlier studies [6, 25], the participant students had a limited understanding of the uncertainty that is inherent in each measurement. There was a strong belief in making one measurement carefully [1]. The participants struggled with many aspects of scientific evidence and with writing laboratory reports. Through discussion in the interviews and feedback from the instructor, students improved their lab reports and submitted reports of better quality for subsequent experiments. The structure of the reports and what was included in each section improved across subsequent lab experiments for each participant.
Secondly, the results indicate serious shortcomings in the preparation of future physics teachers. The
study revealed a lack of knowledge related to experimental skills. Pre-service teachers have difficulties
in connecting what they learn with teaching practice. Our participants were not successful in applying
what they learned in the course to prepare their lesson plans and do the actual teaching. They experienced
particular difficulties when preparing lesson plans for teaching in the lab, when writing lesson
objectives and so on. Only a few of them had prepared adequate lesson plans. It is also true that the
participants have had different teaching experiences prior to joining the research study.
One strength of the project is that, with various research instruments and targeted instruction, it succeeded in eliciting student understanding along with the underlying reasoning. In terms of research methodology, the use of interviews alone may not have explored the full range of difficulties and related reasoning. However, the combination of interviews, targeted instruction, classroom discussion and work on tasks brought up various misunderstandings and difficulties that students experience when they design controlled experiments and evaluate scientific evidence.
A second important achievement of the study has been to document student reasoning in enough
detail to enable our participants to improve through the design of effective instruction and then, to help
them practice effective instruction addressed to secondary school students. This project presented and
analyzed the types of understandings that physics teachers should have developed in order to teach and
assess experiments in the school lab. All these understandings constitute a body of knowledge for
teaching in the lab to help prospective physics teachers become familiar with “rules” of evaluating
experimental procedures and evidence [26]. Those types of understandings will support teachers in developing better concepts of evidence and, further, in becoming confident with teaching in the lab.
This study raises important questions for physics teacher educators. Pre-service teachers need to
develop confidence in teaching physics experiments and in developing assessment goals and methods
related to laboratory skills. They need repeated practice in lab skills. There are implications for pre-
service physics teacher education, because many of the participants are expected to teach lab work in
secondary schools. The development of special courses for the preparation of physics teachers is
necessary. Adequate preparation of science teachers is vital to ensuring good teaching of science in
secondary schools. The standard lab classes in physics departments are not enough; they do not address the needs that students will have as teachers. Most importantly, our physics pre-service teachers are
not the best undergraduate students. Instead, it is usually the case that the low achievers come to the
faculty of education to become physics teachers. Standard instruction often does not improve student
understanding of scientific evidence. Trainee teachers experience difficulties and have needs which must
be considered in planning and implementing their initial education. Instruction that explicitly addresses
the underlying reasoning can have a significant effect on the development of an understanding of scientific evidence.
The study was carried out as a start-up project funded by BAP 10800.

References
[1] Allie S and Buffler A 1998 A course in tools and procedures for Physics I Am. J. Phys. 66(7)
613–24. DOI: 10.1119/1.18915
[2] Hofstein A and Kind P M 2012 Learning in and from science laboratories. In: Fraser B, Tobin K
and McRobbie C (eds) Second International Handbook of Science Education. Springer
International Handbooks of Education, vol 24. Springer, Dordrecht (pp. 189–207). DOI:
10.1007/978-1-4020-9041-7_15
[3] Lunetta V N, Hofstein A and Clough M 2007 Learning and teaching in the school science
laboratory: An analysis of research, theory and practice. In: Lederman N and Abel S (eds)
Handbook of research on science education (pp. 393–441), Mahwah, NJ: Lawrence Erlbaum.
[4] Séré M-G 1999 Learning Science in the Laboratory: Issues raised by the European Project
“Labwork in Science Education”. In: Bandiera M, Caravita S, Torracca E, Vicentini M (eds)
Research in Science Education in Europe. Springer, Dordrecht pp. 165–74. DOI: 10.1007/978-
94-015-9307-6_21
[5] Séré M-G, Journeaux R and Winther J 1998 Enquête sur la pratique des enseignants de lycée dans
le domaine des incertitudes [Survey of high-school teachers’ practice in the field of uncertainties] Bulletin d’Union des Physiciens 801 247–254.
[6] Lubben F, Campbell B, Buffler A and Allie S 2001 Point and set reasoning in practical science
measurement by entering university freshmen Sci. Educ. 85(4) 311–327. DOI:
10.1002/sce.1012
[7] Séré M-G, Journeaux R and Larcher C 1993 Learning the statistical analysis of measurement
errors. Int. J. Sci. Educ. 15(4) 427–438. DOI: 10.1080/0950069930150406
[8] Allie S, Buffler A, Kaunda L, Campbell B and Lubben F 1998 First-year physics students’
perceptions of the quality of experimental measurements Int. J. Sci. Educ. 20(4) 447–459.
DOI:10.1080/0950069980200405
[9] Boudreaux A, Shaffer P S, Heron P R L and McDermott L C 2008 Student understanding of
control of variables: Deciding whether or not a variable influences the behavior of a system
Am. J. Phys. 76(2) 163–70. DOI: 10.1119/1.2805235
[10] McDermott L C 1990 A perspective on teacher preparation in physics and other sciences: The
need for special science courses for teachers Am. J. Phys. 58(8) 734–42. DOI: 10.1119/1.16395
[11] McDermott L C 2014 Melba Newell Phillips Medal Lecture 2013: Discipline-Based Education
Research—A View From Physics Am. J. Phys. 82(8) 729–41. DOI: 10.1119/1.4874856
[12] Bransford J D, Brown A and Cocking R 1999 How people learn: Mind, brain, experience, and
school (Washington DC: National Research Council)
[13] Darling-Hammond L, Bransford J, LePage P, Hammerness K and Duffy H 2007 Preparing
teachers for a changing world (Jossey-Bass).
[14] National Research Council 2005 America’s Lab Report: Investigations in High School Science.
Washington, DC: The National Academies Press.
[15] Wilson S et al (Eds) 2015 Science Teachers’ Learning: Enhancing Opportunities, Creating
Supportive Contexts (Washington, DC: The National Academies Press)
[16] Black P, Harrison C, Lee C, Marshall B and Wiliam D 2003 Assessment for Learning- Putting it
into practice (Maidenhead, U.K: Open University Press)
[17] Wiliam D and Leahy S 2015 Embedding formative assessment: Practical techniques for K-12
Classrooms (West Palm Beach, FL: Learning Sciences International)
[18] Brown J H and Shavelson R J 1996 Assessing hands-on science. Corwin Press: Thousand Oaks,
CA
[19] Gioka O 2006 Assessment for learning in physics investigations: assessment criteria, questions
and feedback in marking Phys. Educ. 41(4) 342–6
[20] Keys C W, Hand B, Prain V and Collins S 1999 Using the Science Writing Heuristic as a Tool
for Learning from Laboratory Investigations in Secondary Science J. Res. Sci. Teach. 36(10)
1065–84. DOI: 10.1002/(SICI)1098-2736(199912)36:10<1065::AID-TEA2>3.0.CO;2-I
[21] Mintzes J J, Wandersee J H and Novak J D (Eds) 2005 Assessing science understanding: A human
constructivist view. London: Elsevier Academic Press
[22] Warren Little J, Gearhart M, Curry M and Kafka J 2003 Looking at student work for teacher
learning, teacher community and school reform Phi Delta Kappan 185–92
[23] Crespo S 2000 Seeing More Than Right and Wrong Answers: Prospective Teachers’
Interpretations of Students’ Mathematical Work J. Math. Teach. Educ. 3(2) 155–81. DOI:
10.1023/A:1009999016764
[24] Miles M B and Huberman A M 1994 Qualitative data analysis: An expanded sourcebook. Sage
Publications, Inc., Thousand Oaks, CA
[25] Varelas M 1997 Third and fourth graders’ conceptions of repeated trials and best representatives in science experiments J. Res. Sci. Teach. 34(9) 853–72. DOI: 10.1002/(SICI)1098-2736(199711)34:9<853::AID-TEA2>3.0.CO;2-T
[26] Thames M H and Ball D L 2010 What mathematical knowledge does teaching require? Knowing
mathematics in and for teaching Teach. Child. Math. 17(4) 220–229