Improving students’ help-seeking skills using metacognitive feedback in an
intelligent tutoring system
Ido Roll a,*, Vincent Aleven b, Bruce M. McLaren b, Kenneth R. Koedinger b
a Carl Wieman Science Education Initiative, University of British Columbia, 6224 Agricultural Road, Vancouver, British Columbia V6T 1R9, Canada
b Human Computer Interaction Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA
* Corresponding author. Tel.: +1 778 938 4141; fax: +1 604 822 2102. E-mail address: ido@phas.ubc.ca (I. Roll).
The present research investigated whether immediate metacognitive feedback on students' help-seeking errors can help students acquire better help-seeking skills. The Help Tutor, an intelligent tutor agent for help seeking, was integrated into a commercial tutoring system for geometry, the Geometry Cognitive Tutor. Study 1, with 58 students, found that the real-time assessment of students' help-seeking behavior correlated with other independent measures of help seeking, and that the Help Tutor improved students' help-seeking behavior while learning Geometry with the Geometry Cognitive Tutor. Study 2, with 67 students, evaluated more elaborate support that included, in addition to the Help Tutor, help-seeking instruction and support for self-assessment. The study replicated the effect found in Study 1. It was also found that the improved help-seeking skills transferred to learning new domain-level content during the month following the intervention, when the help-seeking support was no longer in effect. Implications for metacognitive tutoring are discussed.
Keywords: Help seeking; Self-regulated learning; Cognitive tutors; Intelligent tutoring systems; Metacognition
1. Introduction

1.1. Help seeking in tutoring systems

Knowing when and how to seek help during learning is a key self-regulatory skill (Nelson-Le Gall, 1981; Newman, 1994; Pintrich, 2000). Research in classrooms suggests that adaptive help-seeking behavior helps students learn more effectively (Arbreton, 1998; Karabenick & Newman, 2009; Ryan & Shin, 2011). Help requests (and responses) can be characterized as "instrumental" when they are aimed at learning, or "executive," when they are aimed merely at completing tasks (Karabenick & Knapp, 1991). Effective help seeking has also been shown to be associated with better learning in educational technologies (Aleven, McLaren, Roll, & Koedinger, 2006; Renkl, 2002; Wood & Wood, 1999).

In the present article the focus is on help seeking within an intelligent tutoring system, that is, a system that provides step-by-step guidance as students practice a complex problem-solving skill (VanLehn, 2006).

Most tutoring systems offer help, often in the form of on-demand contextual hints that explain how to solve the current problem step and why (for a review, see Aleven, Stahl, Schworm, Fischer, & Wallace, 2003). Students often can ask for multiple levels of hints for each step (Aleven et al., 2006; Karabenick & Newman, 2009; Mathews & Mitrovic, 2008). When asking for a hint, the student first receives a very general hint, intended to point out a good choice for the next problem step. The student can then ask for more elaborated hints. These hints offer instrumental help by giving increasingly detailed explanations of how to solve the step. The last level of hint often gives the answer for the step, essentially converting a too-challenging problem step into an annotated example.
This last-level hint, often called "bottom-out hint", allows students to complete problem steps that are beyond their ability and to reflect on the solution process (Aleven et al., 2006).¹ Table 1 includes an example of the different levels of hints for a typical problem step in an intelligent tutoring system, with Level 4 being the bottom-out hint.

Table 1
Hint levels in the Geometry Cognitive Tutor.
Hint level   Hint text                                                                                          Type of hint
1            "Enter the value of the radius of circle A"                                                        Orientation
2            "How can you calculate the value of the radius of circle A given the value of the diameter
             of circle A?"                                                                                      Instrumental help
3            "The radius of a circle is half of the diameter"                                                   Instrumental help
4            "The radius of circle A = 46.5"                                                                    Bottom-out hint

¹ The term "bottom-out hint" refers to the most elaborated hints that convey the answer. Students view higher-level hints prior to receiving a bottom-out hint, and can re-read the higher-level hints after reading the bottom-out hint.
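To make the multi-level hint mechanism in Table 1 concrete, the following sketch shows one possible way a hint chain could be represented in software. It is an illustrative reconstruction only, not the Geometry Cognitive Tutor's implementation; the names HintLevel and next_hint, and the chain structure itself, are our assumptions.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class HintLevel:
        level: int   # 1 = most general; the highest level is the bottom-out hint
        text: str
        kind: str    # "orientation", "instrumental", or "bottom-out"

    # Hypothetical hint chain for the problem step shown in Table 1.
    RADIUS_STEP_HINTS: List[HintLevel] = [
        HintLevel(1, "Enter the value of the radius of circle A.", "orientation"),
        HintLevel(2, "How can you calculate the value of the radius of circle A "
                     "given the value of the diameter of circle A?", "instrumental"),
        HintLevel(3, "The radius of a circle is half of the diameter.", "instrumental"),
        HintLevel(4, "The radius of circle A = 46.5.", "bottom-out"),
    ]

    def next_hint(chain: List[HintLevel], hints_seen: int) -> Optional[HintLevel]:
        """Return the next, more elaborate hint, or None once the bottom-out hint has been shown."""
        return chain[hints_seen] if hints_seen < len(chain) else None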
While effective help-seeking behavior has been shown to be important for successful learning with tutoring systems, students often make suboptimal use of the help resources these systems offer (Aleven et al., 2003). Research suggests two main forms of help misuse, namely, help avoidance (underuse) and help abuse (overuse). For example, students may avoid seeking help altogether, even when clearly in need of more guidance (Aleven et al., 2003). Conversely, students may ask for elaborated help when no or little help is needed, or use help in executive ways aimed at finding answers without thinking through the material (Aleven et al., 2006; Karabenick & Newman, 2009). One common form of executive help seeking is rapidly clicking through hints to reach the bottom-out hint to get the answer (Baker, Corbett, Roll, & Koedinger, 2008). Notably, students who are most in need of help make poorer decisions regarding their help-seeking behavior (Karabenick & Newman, 2009; Renkl, 2002; Wood & Wood, 1999). Given the evidence that maladaptive help seeking with tutoring systems is widespread, attempting to improve students' help-seeking behavior in a tutoring system may achieve lasting positive effects on students' ability to learn from such systems (Mercier & Frederiksen, 2007; Roll, Aleven, McLaren, & Koedinger, 2007b).

1.2. Improving students' help-seeking behavior in tutoring systems

One approach to improving the quality of students' help seeking while solving problems with a tutoring system is by delegating some of the responsibility to the system, rather than to the students. For example, a number of systems use a contingent help mechanism in which the level of the hint is automatically adapted to the student's knowledge level, as estimated based on past performance (Luckin & du Boulay, 1999; Wood & Wood, 1999). Similarly, a few tutors from the Cognitive Tutor family (although not the one described below) automatically present hints once a certain threshold number of errors is reached, or enforce a delay of 2 s between repeated hint requests.

While these approaches are likely to result in better help-seeking behavior while they are in effect, they are not likely to help students become better independent help-seekers. A more complete solution would be to help students acquire better help-seeking skills. Robust learning of help-seeking skills would allow students to transfer these skills to novel learning situations, when no explicit support for help seeking is available. The research described in the present article examined whether immediate feedback of a metacognitive nature on students' help-seeking behavior could achieve the goal of robust improvement in students' help-seeking skills. This was done by adding automated feedback on students' help-seeking errors to a commercial tutoring system, the Geometry Cognitive Tutor (Koedinger & Corbett, 2006). At the cognitive (or domain) level, immediate feedback has been shown to improve learning (Corbett & Anderson, 2001). At the metacognitive level, however, the effect of feedback on students' learning is understudied. Metacognitive feedback is defined as feedback that is triggered by students' learning behavior (e.g., avoiding necessary help), and not by the accuracy of their responses at the domain level. Also, metacognitive feedback delivers metacognitive content, that is, it conveys information about desired learning behavior (e.g., advising the student to ask for a hint), rather than domain knowledge (Roll et al., 2007b). A small number of systems use metacognitive feedback to direct students in their learning process (Shute & Glaser, 1990; Wagster, Tan, Wu, Biswas, & Schwartz, 2007). However, to the best of our knowledge, the effect of metacognitive feedback on students' help-seeking behavior is yet to be fully evaluated in controlled studies. Furthermore, we are not aware of any evaluation of the long-term effect of help-seeking training within the context of a tutoring system.

1.3. Identifying help-seeking errors

To give metacognitive feedback, the tutoring system must be able to detect metacognitive errors in real time, without interrupting the learning process. This was done by evaluating students' actions using a metacognitive computational model of help seeking (Aleven et al., 2006). The help-seeking model evaluates help-seeking behaviors in a tutored step-based problem-solving environment such as the Geometry Cognitive Tutor. Unlike other models of help seeking that have been put forward in the literature (Mercier & Frederiksen, 2007; Nelson-Le Gall, 1981), this model is detailed enough to classify individual actions as desired or undesired help-seeking actions. Aleven et al. (2006) present a detailed comparison of the model to other frameworks.
Following the Adaptive Control of Thought – Rational (ACT-R) theory of cognition and learning (Anderson, 1993), the help-seeking model describes help-seeking actions as outcomes of desirable and undesirable rules, taking the form of if-then statements (see Fig. 1; Aleven et al., 2006). For example, according to the model, a student who does not have sufficient knowledge to solve a certain step should ask for enough help to understand the step (instrumental help seeking). Conversely, asking for a bottom-out hint is not desirable when the student's knowledge level and help-seeking pattern suggest that the hint is to be used in an executive manner (for example, when "drilling down" to the bottom-out hint without reading intermediate hint levels). Similarly, a student who has sufficient knowledge to solve a specific problem step should answer it without requesting help, and should do so deliberately (i.e., taking sufficient time to think). The help-seeking model predicts desirable metacognitive actions in any given problem-solving situation within a cognitive tutor environment. Likewise, in any given problem-solving situation within this environment, the help-seeking model is capable of predicting specific help-seeking errors. The help-seeking model is implemented using 80 if-then statements (or rules). These rules are inclusive, that is, each action that a student does in the cognitive tutor is labeled as either desired or undesired help-seeking behavior. This is done by comparing each action performed by the student against those predicted by the help-seeking model, so as to automatically assess a student's help-seeking behavior moment-by-moment, over extended periods of time. The undesirable forms of help-seeking behavior captured by the help-seeking model were shown to correlate with poor learning gains at the domain level across student populations and domains (Aleven et al., 2006).

Fig. 1. The help-seeking model. This flowchart describes the main decision points of the complete help-seeking model. The full model is implemented using 80 if-then rules.
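As an illustration of how an inclusive rule set of this kind can label each logged action, the sketch below classifies a single action as desired behavior or as a specific help-seeking error. It is a simplified, hypothetical reconstruction of the decision points summarized in Fig. 1, not the actual 80-rule model; the mastery cutoff and minimum deliberation time are invented thresholds, and the real model relies on the tutor's own knowledge estimates.

    from dataclasses import dataclass

    @dataclass
    class Action:
        kind: str               # "attempt" or "hint_request"
        skill_mastery: float    # tutor's estimate that the student knows the skill (0-1)
        seconds_elapsed: float  # time spent before acting
        hints_seen: int         # hints already viewed on this step
        is_bottom_out: bool     # True if the requested hint is the last level

    # Hypothetical thresholds; the published model uses its own estimates.
    MASTERY_CUTOFF = 0.6
    MIN_DELIBERATION_S = 5.0

    def classify(action: Action) -> str:
        """Label one student action as desired behavior or a help-seeking error."""
        if action.kind == "attempt":
            if action.skill_mastery < MASTERY_CUTOFF and action.hints_seen == 0:
                return "error: help avoidance (try-step without needed help)"
            if action.seconds_elapsed < MIN_DELIBERATION_S:
                return "error: undeliberate attempt (possible guess)"
            return "desired: deliberate attempt"
        # hint request
        if action.skill_mastery >= MASTERY_CUTOFF:
            return "error: help abuse (hint not needed)"
        if action.is_bottom_out and action.seconds_elapsed < MIN_DELIBERATION_S:
            return "error: drilling down to bottom-out hint"
        return "desired: instrumental hint request"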
1.4. Research questions and hypotheses

The aims of the present studies were to investigate the validity of the help-seeking model as implemented within the Geometry Cognitive Tutor and to evaluate the effect of metacognitive feedback that is based on the help-seeking model on students' help-seeking skills. The paper focuses on the direct effect of help-seeking support on help-seeking skills, and only to a lesser degree on the indirect effect on domain-learning gains. While the effect of given help on domain knowledge acquisition has been previously studied, the effect of automated metacognitive feedback on students' help-seeking behavior is yet to be evaluated.

The first research question focused on whether the help-seeking model, which identifies desired and undesired help-seeking behaviors, has construct validity. While the help-seeking model was shown to correlate with pre to posttest improvements, learning gains are only an indirect measure of metacognitive behavior, and as such, cannot be the sole source of assessment of metacognitive behavior (MacLeod et al., 1996). To establish construct validity, the current studies examined the relationship between poor help-seeking behavior in the cognitive tutor (according to the help-seeking model) and poor help-seeking behavior outside the tutoring environment (such as a paper posttest). The prediction was that the quality of students' help-seeking behavior, as evaluated by the help-seeking model, would correlate with other measures of help seeking based on the paper posttest (Hypothesis 1).

The second research question examined whether automated metacognitive feedback, in the context of tutored problem-solving, leads to more appropriate help-seeking behavior while this feedback is in effect. This was done by comparing students' behavior using the standard version of the Geometry Cognitive Tutor to one that is enhanced with the Help Tutor, a tutoring agent that provides immediate metacognitive feedback on students' help-seeking behavior. The prediction was that metacognitive feedback on help-seeking errors would enhance students' more appropriate help-seeking behavior (such as avoiding redundant bottom-out hints) as compared to the help-seeking behavior of students who do not receive such feedback (Hypothesis 2).
The third research question examined whether the effect of the metacognitive feedback (in terms of improved help-seeking behavior) transfers to learning new domain content within the same learning environment, even after the support of the Help Tutor is no longer in effect. The hypothesis was that the acquired help-seeking skills would transfer to new domain-level content even in the absence of the help-seeking feedback. At the same time, only limited transfer would be found to new tasks outside the Geometry Cognitive Tutor, such as a help-seeking dilemmas paper test (Hypothesis 3).

In what follows, first Study 1 is described, in which the construct validity of the help-seeking model and the effect of metacognitive feedback were evaluated. Then, Study 2 is described, in which the transferability of the acquired help-seeking skills was examined.

2. Study 1: Validating the help-seeking model and evaluating the effect of the help-seeking feedback

Study 1 tested Hypotheses 1 and 2.

2.1. Method

2.1.1. Design
Half of the participating students worked with the Help Tutor, an enhanced version of the Geometry Cognitive Tutor that is capable of giving immediate metacognitive feedback on students' help-seeking errors (Help condition). The other half of the students worked with the unmodified Geometry Cognitive Tutor (Control condition). Students were assigned to condition under the constraint that the conditions were balanced with respect to students' previous achievement in the Geometry Cognitive Tutor class.

2.1.2. Participants
Fifty-eight students who were enrolled in a Geometry Cognitive Tutor class participated in the study. Two classes, with 28 students, taught by one teacher, were from an inner-city school in Pittsburgh (97% minorities, 22% mathematical proficiency, 48% male to 52% female). A different teacher in a suburban school (2% minorities, 84% mathematical proficiency, 51% male to 49% female) taught the other two classes, with 30 students. All students were in Grades 10–11 (15–17 years old). Data regarding proficiency, ethnicity, and gender refer to the entire cohort and are representative of the classes we worked with.

2.1.3. Materials
During the study, all participating students worked on a unit in the Geometry Cognitive Tutor dealing with the geometric properties of circles. As intelligent tutoring systems typically do (VanLehn, 2006), the Geometry Cognitive Tutor provides step-by-step guidance with a complex problem-solving skill. It uses a cognitive model to track students in their various approaches through these problems, estimate their knowledge level, and provide domain-level feedback and on-demand hints regarding the steps of these problems. Fig. 2 shows the interface of the tutor used in this study. In the Scenario window the tutor presents a problem statement and the students type their answers (on the left). In the upper-right-hand corner students can see their estimated proficiency level on each skill.

Fig. 2. The Geometry Cognitive Tutor interface.
The Geometry Cognitive Tutor has two built-in help-seeking mechanisms: on-demand contextual hints and a glossary. Students can ask for contextual hints at increasing levels of detail, as described above, by clicking on the hint button (marked "?"). The second help resource is a searchable Glossary that contains all relevant theorems, illustrated with examples (bottom right corner).

In addition to the domain-level feedback offered by the Geometry Cognitive Tutor itself, half of the students in the study received metacognitive feedback from the Help Tutor. The Help Tutor is an add-on tutor agent to the Geometry Cognitive Tutor. It teaches help-seeking skills by giving metacognitive feedback on students' help-seeking errors in the context of learning a domain-specific problem-solving skill. The Help Tutor uses the help-seeking model described above to trigger its feedback. When the student commits a help-seeking error (for example, by "drilling down" to the bottom-out hint without reading intermediate hints), the Help Tutor gives immediate feedback (e.g., "No need to hurry so much. Take your time and read the hint carefully. Consider trying to solve this step without another hint. You should be able to do so"; see Fig. 3). The Help Tutor messages include only domain-independent metacognitive content for several reasons: to encourage students to focus more on the metacognitive feedback (and not be distracted by domain content), to help students generalize the help-seeking skills, and to make the Help Tutor reusable with different Cognitive Tutors. The Help Tutor messages use the existing hint-window mechanism and are distinguished from regular hints in their font (color and type).

Fig. 3. The Help Tutor feedback messages.
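The sketch below illustrates one way a detected help-seeking error could be mapped to a domain-independent message delivered through the existing hint window. Apart from the message quoted above, the message texts, the error labels (which follow the classification sketch in Section 1.3), and the function metacognitive_feedback are illustrative assumptions rather than the Help Tutor's actual implementation.

    from typing import Optional

    # Hypothetical mapping from detected error type to a domain-independent
    # metacognitive message, shown in the tutor's existing hint window.
    FEEDBACK_MESSAGES = {
        "drilling down to bottom-out hint":
            "No need to hurry so much. Take your time and read the hint carefully. "
            "Consider trying to solve this step without another hint. "
            "You should be able to do so",  # message quoted in the text above
        "help avoidance (try-step without needed help)":
            "This step seems difficult. Asking for a hint may help you understand it.",
        "undeliberate attempt (possible guess)":
            "Slow down and think the step through before entering an answer.",
        "help abuse (hint not needed)":
            "You probably know this one. Try the step on your own first.",
    }

    def metacognitive_feedback(error_label: str) -> Optional[str]:
        """Return a Help Tutor style message for a help-seeking error, or None for desired behavior."""
        for error_type, message in FEEDBACK_MESSAGES.items():
            if error_type in error_label:
                return message
        return None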
2.1.4. Instruments
To address the first research question, namely evaluating the construct validity of the help-seeking model, students' help-seeking behavior was assessed using a paper-and-pencil posttest. The posttest included 9 geometry problems with 18 problem steps (see example in Fig. 4). Most of these problems included a procedural component, in which students were asked to calculate the measure of an angle, a circle arc, or a circle chord, and a declarative component, in which students were asked to name the theorem they used. To evaluate help-seeking behavior within the posttest, independent of domain knowledge, all items in the test appeared in two versions, counterbalanced between forms: with or without hints. Items without hints (i.e., conventional test items) were used to assess the difficulty level of the specific test item for the specific group of students. The other type of items included hints, as seen in Fig. 4. The difference between students' performance on identical items with and without hints (for the same group of students) is a direct measure of the quality of students' hint comprehension and application abilities.

Fig. 4. Pre and posttest items.

A different type of item was used to assess students' ability to identify desired help-seeking behavior. In these items, students were asked to choose the most appropriate action in hypothetical help-seeking dilemmas (cf. Al-Hilawani, 2003). These items described common situations that students encounter when working with the Geometry Cognitive Tutor and adhere to the help-seeking strategies put forward by the help-seeking model. For example, the hypothetical situation described in Fig. 5 suggests that the student knows the target material, and yet, has not tried to correct her own error. The help-seeking model suggests that at this point the student should attempt to review her calculations prior to asking for a hint. Note that these items do not always assume that more help is better. For example, the situation described in Fig. 5 suggests that less help is a more appropriate action. To reduce guessing, students were given 1 point for each correct answer and −1 for each wrong answer.

Fig. 5. Hypothetical help-seeking dilemmas assessment.

To address the second research question, the effect of the metacognitive feedback, students' help-seeking behavior while working with the Geometry Cognitive Tutor was evaluated using the help-seeking model. In addition to triggering feedback, the output of the help-seeking model can be used to calculate error rates for different types of help-seeking errors.

A secondary goal of the study was to evaluate the effect of metacognitive feedback on domain-level learning gains (in this case, the properties of circles). Since the domain results are reported elsewhere (Roll et al., 2006), and since the present article focuses on metacognitive learning results (and not an indirect improvement in domain learning), the domain results are not included here. Yet, to convey a complete picture of the effect of the given metacognitive feedback, domain-learning outcomes are discussed in Section 2.3.2, the Effect of the Help Tutor.

2.1.5. Procedure
Students worked with the Geometry Cognitive Tutor for six 45-min sessions over three weeks. No instruction on help seeking or circles was given in advance. The tutor recorded detailed log files of students' interactions with the tutor. The recorded information included attempted solutions, use of help resources, and time stamps. Pre and posttests were collected on paper on the first and last day of the study. The students in the inner-city school did not complete the posttest and, thus, only data logged by the Cognitive Tutor is available for students in these classes. These logs include online help-seeking and domain-level problem-solving data.
2.1.6. Data analysis
Data from the log files was used to calculate a help-seeking error rate for each student (i.e., what proportion of a student's actions were categorized as help-seeking errors by the model). The error rate for specific actions (e.g., faulty hint requests) is reported out of the overall instances of that action (e.g., out of all hint requests) and not out of all actions. The mean help-seeking error rate for each condition was computed by averaging the students' individual error rates.

Performance on the different measures was analyzed using a two-tailed equal-variance t-test. Comparisons between different types of measures were done using partial correlations. An alpha level of .05 was used in all statistical tests.
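The following sketch illustrates this analysis pipeline: a per-student help-seeking error rate computed from labeled log actions, condition means, a two-tailed equal-variance t-test, and Cohen's d based on the pooled standard deviation. The data structures (a list of action labels per student) are assumed for illustration; this is not the authors' analysis code.

    from statistics import mean, stdev
    from math import sqrt
    from scipy import stats  # two-tailed equal-variance t-test

    def error_rate(labels):
        """Proportion of a student's logged actions labeled as help-seeking errors."""
        return sum(label.startswith("error") for label in labels) / len(labels)

    def cohens_d(group_a, group_b):
        """Effect size using the pooled standard deviation."""
        n_a, n_b = len(group_a), len(group_b)
        pooled = sqrt(((n_a - 1) * stdev(group_a) ** 2 +
                       (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2))
        return (mean(group_a) - mean(group_b)) / pooled

    def compare_conditions(logs, help_ids, control_ids):
        """logs: assumed dict mapping student id -> list of action labels (e.g., from classify())."""
        help_rates = [error_rate(logs[s]) for s in help_ids]
        control_rates = [error_rate(logs[s]) for s in control_ids]
        t, p = stats.ttest_ind(help_rates, control_rates, equal_var=True)
        return mean(help_rates), mean(control_rates), t, p, cohens_d(help_rates, control_rates)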
2.2. Results

Overall, students solved on average 128 problem steps per student (SD = 61). Students performed on average 685 actions per student (SD = 962), including correct and incorrect entries, hint requests, and glossary access.

2.2.1. Validation of the help-seeking model
The first aim of Study 1 was to evaluate whether actions assessed as help-seeking errors by the help-seeking model are associated with other forms of poor help-seeking behavior. The difference between students' performance on posttest items with no hints and items with hints offers a direct measure of how well students use hints in the paper tests. The online help-seeking error rate (that is, the proportion of actions labeled help-seeking errors) correlated with performance on items with hints on the paper posttest, controlling for performance on items with no hints on the same test (aggregated across conditions; partial r = .50, p < .01).

Students' online help-seeking error rate also correlated significantly with their performance on the hypothetical dilemmas assessment. A median split analysis showed that students who made fewer help-seeking errors in the tutor scored significantly higher on the hypothetical dilemmas assessment (77% vs. 59%, respectively), t(28) = 2.2, p = .04, Cohen's d = 0.83.
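A partial correlation of this kind can be computed by residualizing both measures on the control variable (here, performance on items with no hints) and correlating the residuals. The NumPy sketch below shows this standard procedure; the variable names are ours and the snippet is illustrative rather than the analysis code used in the study.

    import numpy as np

    def partial_corr(x, y, control):
        """Correlation between x and y, controlling for a single covariate,
        computed by correlating the residuals of least-squares regressions."""
        x, y, control = map(np.asarray, (x, y, control))
        design = np.column_stack([np.ones(len(control)), control])
        resid_x = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
        resid_y = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
        return np.corrcoef(resid_x, resid_y)[0, 1]

    # Assumed usage: partial_corr(online_error_rate, hinted_item_score, unhinted_item_score)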
2.2.2. Effect of the metacognitive feedback on students' behavior in the tutor
The second aim of Study 1 was to evaluate how the metacognitive feedback students received affected their online behavior within the Geometry Cognitive Tutor. Overall, the mean help-seeking error rate of students in the Help condition was significantly lower than that of the Control condition (14% vs. 16%, respectively), t(64) = 2.0, p = .05, Cohen's d = 0.50 (see Table 2). The Help Tutor had a different effect on different actions. Specifically, when asking for hints, students working with the Help Tutor made significantly fewer help-seeking errors. In fact, 26% of students' hint requests in the Help condition were categorized as faulty, compared to 36% in the Control condition, t(57) = 5.7, p < .001, Cohen's d = 1.51. The Help Tutor also had an effect on the level of hints students asked to see. Students in the Control group asked for the bottom-out hint on 70% of their hint sequences, while students in the Help condition asked for the bottom-out hint on only 48% of their hint sequences, t(64) = 4.3, p < .001, Cohen's d = 1.07.

Help-seeking errors can also be made when trying a step, for example, when students attempt a step too quickly or make an attempt when the model determines they should seek help. There was no significant difference between the conditions in the rate of metacognitive errors related to solution attempts, t(58) = .53, ns. There was also no significant difference between the conditions on the rate of faulty glossary searches, t(49) = .27, ns.

Table 2
Multiple measures from Study 1 as a function of group. Top: Students' help-seeking error rates (%) and effect sizes (Cohen's d) while working with the Geometry Cognitive Tutor. Bottom: Students' posttest scores on items with and without embedded hints, and scores on the hypothetical help-seeking dilemmas paper assessment.

                                                 Help group   Control group   Cohen's d
Help-seeking error rate
  Overall help-seeking error rate                14%*         16%             0.50
  Hint                                           26%**        36%             1.50
  Try-step                                       8%           9%
  Glossary                                       22%          23%
  % of hint sequences to reach bottom-out hint   48%**        70%             1.10
Help-seeking assessments during posttest
  Posttest items with hints                      .41          .40
  Posttest items with no hints                   .35          .31
  Hypothetical help-seeking dilemmas             .74          .64
*p < .05; **p < .01.

2.2.3. Effect of the Help Tutor on other measures of help seeking
There was no significant difference between conditions with regard to students' performance on the hypothetical help-seeking dilemmas, F(1, 28) = 1.3, ns. Nor was there a difference between conditions with regard to performance on items with embedded hints, F(1, 28) = 0, ns.

2.3. Discussion

2.3.1. Validation of the help-seeking model
Study 1 found that online help-seeking behavior according to the help-seeking model correlated with students' performance on posttest items with hints, controlling for performance on items with no hints. In other words, students who applied better help-seeking skills while working with the tutor (according to the help-seeking model) were better able to use the hints embedded in the paper test when controlling for domain knowledge (i.e., controlling for performance on items with no hints). Thus, the metacognitive behavior that was captured by the help-seeking model was consistent across environments. Furthermore, it was found that behavior according to the help-seeking model correlated with students' ability to identify desired help-seeking behavior in the hypothetical dilemmas assessment. The significant correlation between the help-seeking model and two independent measures of help seeking on paper confirms the construct validity of the help-seeking model, in support of Hypothesis 1.

2.3.2. The effect of the Help Tutor
The results of Study 1 also suggest that students in the Help condition improved several aspects of their help-seeking behavior, in support of Hypothesis 2. Mainly, the Help Tutor led students to limit the use of executive help (that is, to ask for fewer bottom-out hints). At the same time, the frequency of students' help avoidance was not reduced, compared to the control condition. Perhaps the Help Tutor was a victim of the same behavior it attempts to cure, that is, students who tend to avoid using the tutor's help at the domain level also ignored the Help Tutor's recommendations.

Notably, the Help Tutor did not force students into better help-seeking behavior. For example, a student who is determined to view the bottom-out hint is able to do so even if the system recommends against it. Therefore, the effect of the Help Tutor on the more common types of errors suggests that metacognitive feedback can be used to improve students' behavior in tutoring systems. Yet, the present study did not provide evidence that the Help Tutor helped students learn transferable help-seeking skills. Study 2 focused on replicating the effect of the Help Tutor and assessing the transferability of the improved help-seeking patterns.

As mentioned earlier, Study 1 also measured learning at the domain level. Results of domain-level assessments show that while students in both conditions improved significantly from pre to posttest, there was no effect of the Help Tutor on students' learning gains (for complete results see Roll et al., 2006). The lack of effect on domain knowledge raises an important question, namely why the improved help-seeking behavior did not lead to improved learning at the domain level. One possible explanation suggests that the Help Tutor messages were interpreted in the local context in which they were given. Students may have treated them as an additional task, and while they followed their recommendations, they did not engage in deliberate help seeking. This behavior is sensible, given that domain-level hints are often specific to the problem step at hand. This explanation emphasizes a limitation of Study 1, that is, that knowledge of help seeking was not pitched as a valid learning goal, alongside knowledge of Geometry. In fact, students in the study received no explicit instruction on help seeking, and no connection was made between help-seeking behavior and success in learning the domain. To make this connection more explicit, Study 2 included additional help-seeking support in the form of direct instruction on help seeking. In addition, Study 1 did not stress the general nature of help-seeking skills. To help students realize the wide applicability of these skills, Study 2 included help-seeking support across multiple instructional units.

An alternative explanation for the lack of effect of the Help Tutor on domain-learning gains suggests that the metacognitive feedback imposed excessive cognitive load that interfered with learning the domain knowledge. To reduce the cognitive load that is associated with help seeking, Study 2 included an environment in which students could engage in self-assessment and help-seeking behavior outside the Geometry Cognitive Tutor. During these self-assessment episodes students identified the domain-level skills on which they need assistance, thus bridging between the domain-independent help-seeking principles and the relevant domain-specific knowledge.

An additional limitation of Study 1 has to do with the student population that was studied. In the absence of posttest results from the lower-achieving school, the study used only data from the relatively high-achieving school to validate the help-seeking model (Research Question 1). However, given the previously reported high correlation between the help-seeking model and domain-level learning gains, it is likely that the validity of the Help Tutor is not limited to high-performing students. Note that data from both schools was used to evaluate the effect of the Help Tutor on students' help-seeking behavior (Research Question 2).

3. Study 2: Evaluating the transferability of the improved help-seeking skills

Study 2 evaluated Hypothesis 3, namely, that students' improved help-seeking behavior will transfer to new content within the Geometry Cognitive Tutor. Hypothesis 3 also predicted that no transfer would be found to paper-and-pencil measures of help-seeking knowledge. As described above, a few changes were made to the help-seeking support offered to students.
First, a brief in-class instruction on help seeking was given to students in the Help condition. The instruction included a 4-min video presentation with annotated examples of productive and faulty help-seeking behavior in the Geometry Cognitive Tutor. The goals of the instruction were threefold: to give students a better declarative understanding of desired help-seeking behavior, to improve students' dispositions towards seeking help, and to frame help-seeking knowledge as an important learning goal.

In addition to the Help Tutor and the help-seeking instruction, students in the Help condition worked with the Self-Assessment Tutor. The Self-Assessment Tutor was developed with multiple goals in mind: to make the connection between adaptive help seeking and domain learning salient, to help students get in the habit of self-assessing their ability, to help students evaluate their knowledge level on the target set of skills, and to offer students opportunities to practice their help-seeking skills outside the context of the Geometry Cognitive Tutor. The Self-Assessment Tutor capitalized on the idea that correctly assessing one's own ability was found to correlate with strategic use of help (Tobias & Everson, 2002). The Self-Assessment Tutor makes the self-assessment process explicit and scaffolds it, similarly to the learning process with the Metacognitive Instruction using a Reflection Approach system (MIRA; Gama, 2004). In the Self-Assessment Tutor the student first assesses her ability to solve a prototypical problem from the subsequent Geometry Cognitive Tutor section. She then attempts to solve the problem (on-demand domain-level hints are made available at this stage). Finally, the student reflects upon her prior assessment and ability (see Fig. 6). Students used the Self-Assessment Tutor for about 5 min prior to each Geometry Cognitive Tutor section (each of the two units in this study had four sections).

Fig. 6. The Self-Assessment Tutor.

The third and last change from Study 1 to Study 2 was that the Help Tutor in Study 2 was used in two instructional units. Between the two units, as well as after the second unit, students from both conditions used the unmodified version of the Geometry Cognitive Tutor (without the Help Tutor support). The Help Tutor support was extended to two instructional units (and months) to see whether a "double dosage" of help-seeking support would improve transfer, especially since students may identify similar patterns of desired help-seeking behavior across different domain-level units.

3.1. Method

3.1.1. Design
Study 2 spanned a period of four months and four instructional units. During Month 1 students learned about Angles. Month 2 was devoted to review of the previous eleven units in preparation for statewide exams. The topic of Month 3 was Quadrilaterals. During Month 4 students worked on different units of the tutor (mostly triangles and 3-D geometry). As in Study 1, Study 2 compared two conditions: students in the Help condition received support for their help-seeking behavior, and students in the Control condition worked with the unmodified Geometry Cognitive Tutor as they would have without the study. Unlike Study 1, students in the Help condition had the support "turned off" for part of the study. Help condition students received support only during Months 1 and 3. During Months 2 and 4 they too worked with the unmodified Geometry Cognitive Tutor, similar to students in the Control condition. Comparing the help-seeking behavior of students from both conditions during Months 2 and 4 can be used to evaluate the transferability of the acquired help-seeking skills across domain-level topics within the same environment. Furthermore, comparing students' help-seeking behavior following Month 2 to that following Month 4 can be used to evaluate a dosage effect, that is, whether receiving help-seeking support for two instructional units (during Months 1 and 3) yields different transfer results compared with receiving support during a single instructional unit (during Month 1 alone).

3.1.2. Participants
Study 2 took place in a rural vocational high school in Western Pennsylvania (3% minorities, 25% mathematical proficiency). Participants were 67 students, 10th and 11th graders (15–17 years old), who were enrolled in a Geometry Cognitive Tutor course and were accustomed to the tutor's interface. While no exact gender information is available, enrollment in these classes was 69% male to 31% female. Since the study included teacher-led classroom instruction, assignment to conditions was done at the class level. Two classes (with 29 students) were assigned to the Help condition and two classes (38 students) were assigned to the Control condition. Two teachers taught the classes so that each teacher taught one class in each condition. Classes were assigned in consultation with the teachers, attempting to control for number of students and time of day. One class that was reported by the teachers to be of lower ability was assigned to the Help condition.

3.1.3. Materials
The help-seeking support offered to students in the Help condition during Months 1 and 3 included (a) a classroom instruction on help seeking, (b) the Self-Assessment Tutor, and (c) the Help Tutor.
3.1.4. Instruments
Students' behavior while receiving support (during Months 1 and 3) was assessed using the help-seeking model. Students' help-seeking behavior once support from the Help Tutor was no longer available (during Months 2 and 4) was assessed using log files from students' interaction with the unmodified Geometry Cognitive Tutor. This analysis focused on two key aspects of help-seeking behavior in the Geometry Cognitive Tutor. The first is the average hint level. A high average hint level suggests that students often ask for bottom-out hints, implying that they rely on more executive, rather than instrumental, help. Hence, one would expect the Help Tutor to reduce the average hint level students ask to see. The second aspect in the analysis is time to take action. Rapid actions usually suggest guessing or clicking through hints without reading their content (Aleven et al., 2006). Hence, one would expect the Help Tutor to encourage students to act more deliberately.

In addition, following the intervention, students completed a hypothetical help-seeking dilemmas assessment, identical to the one used in Study 1. Detailed descriptions of the domain-level assessments and results are provided elsewhere (Roll, Aleven, McLaren, & Koedinger, 2007a).
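A sketch of how these two measures could be derived from time-stamped log events is shown below. The event format (one record per hint request or solution attempt, with a hint level and a timestamp) is an assumed schema for illustration, not the documented structure of the Cognitive Tutor log files.

    from statistics import mean

    # Assumed event records, e.g.:
    #   {"type": "hint", "level": 3, "time": 105.2} or {"type": "attempt", "time": 131.8}
    # with times in seconds from the start of the session.

    def average_hint_level(events):
        """Mean level of the hints a student asked to see (higher = closer to bottom-out)."""
        levels = [e["level"] for e in events if e["type"] == "hint"]
        return mean(levels) if levels else None

    def mean_time_before_action(events, action_type):
        """Mean number of seconds elapsed before actions of one type (hint request or attempt)."""
        gaps = []
        previous_time = None
        for e in events:
            if previous_time is not None and e["type"] == action_type:
                gaps.append(e["time"] - previous_time)
            previous_time = e["time"]
        return mean(gaps) if gaps else None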
3.1.5. Procedure
As described above, Study 2 spanned a period of four months. Students in the Help condition received support during Months 1 and 3, but not 2 and 4. The support included 4 min of help-seeking instruction at the beginning of each unit. However, due to the low quality of the video equipment in the school, students could not notice the details of the instruction. Following the instruction, students in the Help condition completed interleaved Geometry Cognitive Tutor + Help Tutor and Self-Assessment Tutor sessions, with the self-assessment sessions taking about 10% of the students' time. Each unit (during Months 1 and 3) included four sections, hence each help-seeking instruction was followed by four cycles of interleaving Geometry Cognitive Tutor and Self-Assessment Tutor sections (see Fig. 7). There was no difference between conditions during Months 2 and 4. Students in both conditions worked with their respective versions of the Geometry Cognitive Tutor for two days a week. The remaining three days were used by the teacher to cover the same material in class, as is commonly done in Geometry Cognitive Tutor courses.

Fig. 7. Procedure of Study 2. The figure marks the declarative help-seeking instruction and the self-assessment episodes; it is not to scale, that is, self-assessment and instructional activities took about 10% of the class time.

3.1.6. Statistical analysis
Analysis of the data was done using similar statistical procedures to the ones described in Study 1. To allow for easy comparisons between different instructional units, Cohen's d effect sizes are presented, in addition to raw values. An alpha level of .05 was used in all statistical tests.

3.2. Results

3.2.1. Help-seeking behavior within the tutor
While using the Geometry Cognitive Tutor, students in the Help condition committed significantly fewer help-seeking errors on hints, compared to their counterparts in the Control condition, that is, 13% vs. 20% in Month 1, t(52) = 4.3, p < .001, Cohen's d = 1.2, and 12% vs. 19% in Month 3, t(56) = 5.5, p < .001, Cohen's d = 1.47 (see Table 3). The Help Tutor had no effect on students' help-seeking error rate during solution attempts (e.g., help avoidance), and it increased students' rate of inappropriate glossary searches in Month 1, although this difference was only marginally significant (1% vs. 2%), t(57) = 1.9, p = .06, Cohen's d = 0.50. Due to a technical error, no glossary searches were logged in Month 3.

When asking for hints, Help students did not drill down as much as Control students. Students in the Help condition asked to see the bottom-out hint on only 45% of their hint sequences, compared to 60% of the Control students in Month 1, t(52) = 2.3, p < .03, Cohen's d = 0.64, and on 41% compared to 53% of the Control students during Month 3, t(56) = 2.2, p = .03, Cohen's d = 0.59. Furthermore, the Help Tutor significantly reduced the mean hint level in both months, that is, 3.1 vs. 3.7 in Month 1, t(41) = 3.5, p < .001, Cohen's d = 1.10, and 3.3 vs. 3.9 in Month 3, t(45) = 4.4, p < .001, Cohen's d = 1.31.

With respect to the time taken on the different types of actions, students in the Help condition did not take more time before asking for their first hint, but they took longer while reading hints and before asking for additional ones compared to the Control condition, that is, 6 vs. 5 s in Month 1, t(51) = 3, p < .001, Cohen's d = 0.84, and 8 vs. 6 s in Month 3, t(56) = 2.2, p < .04, Cohen's d = 0.59. Students in the Help condition also took significantly more time before making attempts compared to the students in the Control condition, that is, 16 vs. 14 s in Month 1, t(65) = 2.5, p < .02, Cohen's d = 0.62, and 22 vs. 19 s in Month 3, t(62) = 2.7, p < .01, Cohen's d = 0.69.
Table 3
Effect sizes (Cohen's d) of differences between the Help and the Control groups in Study 2 as a function of month of tutoring. Month 1: Help group received help-seeking support (topic: Angles). Month 2: both groups worked with the unmodified Cognitive Tutor (different units). Month 3: Help group received help-seeking support (topic: Quadrilaterals). Month 4: both groups worked with the unmodified Cognitive Tutor (different units).

                                           Month 1               Month 2              Month 3               Month 4
Measures using the help-seeking model
  Overall help-seeking error rate          0.1 (7% vs. 8%)       N/A                  0.4 (13% vs. 14%)     N/A
  Hint                                     1.2** (13% vs. 20%)   N/A                  1.4** (12% vs. 19%)   N/A
  Try-step                                 0.3 (7% vs. 8%)       N/A                  0.0 (13% vs. 13%)     N/A
  Glossary                                 0.5 (2% vs. 1%)       N/A                  N/A                   N/A
  % of drilling down to bottom-out hint    0.6* (45% vs. 60%)    N/A                  0.6* (47% vs. 59%)    N/A
Depth of requested hints
  Average hint level                       1.1** (3.1 vs. 3.7)   0.1 (2.2 vs. 2.1)    1.3** (3.3 vs. 3.9)   0.5* (2.2 vs. 2.6)
Time before action
  Ask for first hint                       0.2 (18 vs. 16 s)     0.3 (24 vs. 22 s)    0.2 (24 vs. 22 s)     0.2 (24 vs. 22 s)
  Ask for additional hint                  0.7* (6 vs. 5 s)      0.3 (11 vs. 10 s)    0.6* (8 vs. 6 s)      0.9** (12 vs. 8 s)
  Try step                                 0.6* (16 vs. 14 s)    0.4 (25 vs. 22 s)    0.7** (22 vs. 19 s)   0.3 (24 vs. 22 s)

N/A = not available. *p < .05; **p < .01. To support comparisons across units, values reported are effect sizes of the differences between conditions (Cohen's d). Numbers in parentheses are raw values, Help vs. Control condition.
3.2.2. Transfer of help-seeking skills
The columns labeled Month 2 and Month 4 in Table 3 show data regarding the effect of the Help Tutor when students moved on to work on different units and the Help Tutor's feedback was no longer available (Research Question 3). Seven of the eight measures in these two columns were in the desired direction, with the Help Tutor students carrying forward patterns of improved help-seeking behavior into the months with no help-seeking support.

As described in Section 3.2.1, the Help Tutor reduced students' average hint level while it was in effect, that is, during Months 1 and 3. As shown in the Month 2 column of Table 3, this effect did not carry over to Month 2, yet was marginally significant in Month 4, that is, mean hint level was 2.2 vs. 2.6, t(51) = 1.9, p = .06, Cohen's d = 0.50.

As described above, Help Tutor students took more time to read hints before asking for additional ones while the Help Tutor support was in effect. No difference in hint reading time was observed during Month 2, following the first use of the Help Tutor. However, a difference in hint reading time was found after the second unit with the Help Tutor, with students in the Help condition taking more time than students in the Control condition, that is, in Month 4 mean time was 12 vs. 8 s, t(44) = 3.0, p < .01, Cohen's d = 0.90.

Another effect that was observed while working with the Help Tutor was that students in the Help condition took longer than students in the Control condition before attempting a solution. This effect did not persist in the absence of the Help Tutor during Months 2 and 4. The time students took before asking for their first hint was not affected by the Help Tutor, and naturally, no effect was observed in its absence.

Between Months 3 and 4 students completed the hypothetical help-seeking dilemmas assessment. Students in the Help condition performed significantly better on this assessment; specifically, .36 in the Help condition and .04 in the Control condition, t(41) = 2.4, p = .02, Cohen's d = 0.75 (to reduce guessing, wrong answers received a score of −1).

3.3. Discussion

Study 2 found that the help-seeking support led to an improvement in students' help-seeking behavior. While the support was in effect, students used hints more deliberately (i.e., took more time to request the next level, requested fewer hint levels, and fewer bottom-out hints). The support did not lead to a reduction in help avoidance, and, in contrast to Study 1, led to a marginally significant increase in the rate of errors related to the glossary. However, the low frequency of these errors (1–2% of all glossary searches) makes them less important.

The main finding of Study 2 is that students transferred their improved help-seeking skills to new units of the Geometry Cognitive Tutor where the help-seeking support was no longer in effect. The significant improvement in students' behavior in the unsupported environment was found only during Month 4, and not during Month 2. This finding suggests that students could extract the domain-independent nature of help-seeking skills only after they received support for their help-seeking behavior across two different instructional units (angles and quadrilaterals) and for a longer duration. Also, unlike Study 1, Study 2 found that students in the Help condition transferred their improved help-seeking skills to the hypothetical help-seeking dilemmas assessment. Multiple explanations can account for the transfer of help-seeking skills that was observed in Study 2 (but not in Study 1).
First, Study 2 included an instruction on desired help-seeking behavior, delivered via a short video. While the quality of the video was too low for students to notice details, a clear message regarding the importance of help-seeking skills (and their relevance to the Geometry class) was conveyed. The Self-Assessment Tutor may have had a similar effect. Though the self-assessment episodes did not include explicit advice on how to seek help, they directed students' attention to help seeking, and helped them relate the general help-seeking principles to their specific knowledge gaps. Therefore, the transfer of the improved help-seeking skills may be attributed to the interaction between the attention-focusing manipulations (help-seeking instruction and self-assessment episodes) and the detailed feedback of the Help Tutor.

In addition to the described measures, Study 2 found no effect of the help-seeking support on domain learning during Months 1 and 3 (see Roll et al., 2007a for detailed results). Learning gains were not evaluated during Months 2 and 4. The fact that improved help-seeking behavior did not lead to improved learning requires an explanation. One explanation may be that the improvement in help-seeking behavior was not sufficient to measurably impact domain-level learning. However, one would expect to see at least a trend in the test scores. Another possible explanation for the lack of effect on domain learning may be excessive cognitive load when receiving support on both the domain and the metacognitive level. In this case, the Help Tutor may have led to improved learning during the periods in which it was no longer in effect, when students were not subject to the cognitive load imposed by its messages. Domain-learning gains were not measured during Months 2 and 4, and thus this hypothesis cannot be evaluated with the available data. An alternative explanation is detailed in the General discussion.

4. General discussion

The Help Tutor uses the help-seeking model to identify errors students make while working with the Geometry Cognitive Tutor. The help-seeking model was previously shown to capture, in real time, help-seeking errors that are associated with poor learning (Aleven et al., 2006). Study 1 found that assessing students' help-seeking behavior using the help-seeking model also correlated with other direct measures of help-seeking behavior, namely, the use of hints embedded in paper tests, and identifying correct help-seeking strategies in hypothetical learning dilemmas. These findings support Hypothesis 1 and demonstrate the usefulness of unobtrusive moment-by-moment assessment of help seeking using a detailed computational metacognitive model. Such assessment enables adaptation of support not only to students' domain knowledge, but also to their metacognitive behavior.

As for the effect of the metacognitive feedback, several aspects of students' hint usage improved significantly while working with the Help Tutor. Most notably, students asked for fewer bottom-out hints, often associated with executive help seeking (Aleven et al., 2006), and took more time to read the tutor's domain-level hints. This improvement on the most common forms of help-seeking errors supports Hypothesis 2. At the same time, no improvement in students' attempts was found (for example, in the rate of unskilled students guessing rapidly). It may be that students who tend to avoid seeking help at the domain level also do not pay attention to help offered at the metacognitive level.

The effect of the Help Tutor was not limited to the supported environment. Study 2 found that most aspects of the improved help-seeking behavior transferred to new tasks even in the absence of help-seeking support. These improvements consistently reached significance only after an extended period of time of working with the Help Tutor, across two instructional units. It was hypothesized that this effect is the outcome of the interaction between multiple factors. First, since the Help Tutor feedback appeared in two different areas of geometry learning, students could more easily extract the domain-independent help-seeking skills and thus transfer them better to the other subject-matter areas. Second, the manipulation directed students' attention to help seeking in general (during the help-seeking instruction) and specifically to one's own help-seeking needs (during the self-assessment episodes), thus framing knowledge of help seeking as a valuable learning goal alongside knowledge of Geometry. The addition of the help-seeking instruction and self-assessment episodes in Study 2 also led to a measurable transfer to a more conceptual task, that is, students' ability to identify desired help-seeking strategies. The evidence for transferable help-seeking skills supports Hypothesis 3. Yet, Hypothesis 3 did not anticipate the more conceptual transfer to the help-seeking dilemmas assessment.

Alongside the improvement in students' help-seeking skills, neither study showed improvement in students' domain learning while they received help-seeking support. One explanation is that the Help Tutor, when used in the context of domain learning, imposes excessive cognitive load. A different explanation is that while students in the Help condition were engaged in better learning behaviors, these do not necessarily suffice to yield better learning. The current focus on help seeking may be missing critical elements. Specifically, it may be that learning from hints is harder than perceived. For example, Renkl (2002) showed that automated hints (which he termed "instructional explanations") might hinder students' tendency to self-explain and thus learn. Reevaluating the existing help-seeking literature suggests that learning from given hints may indeed be a rather complex process. Very few experiments have manipulated help seeking in Intelligent Tutoring Systems to date. Those that did often found that interventions that increase reflection yield better learning: Dutke and Reimer (2000) found that principle-based hints are better than operative ones; Ringenberg and VanLehn (2006) found that analogous solved examples may be better than conventional hints; Schworm and Renkl (2002) found that deep reflection questions caused more learning compared with conventional hints; and Baker et al. (2006) showed that auxiliary exercises for students who misuse hints help them learn better. These results suggest that in order to learn from given help, applying the help should be followed by reflection on the process. Additional support for this hypothesis can be found in Shih, Koedinger, and Scheines (2008), who showed that bottom-out hints were most useful to students who spontaneously self-explained following the hint.
While Studies 1 and 2 demonstrated the benefits of automated metacognitive feedback, they have several limitations. First, Study 2 confounded several manipulations (the Help Tutor, the help-seeking instruction, and the self-assessment episodes). As a result, the relative contribution of each manipulation cannot be determined. In addition, the studies evaluated help-seeking support within a specific type of tutoring system, namely, coached problem-solving environments. Study 1 found that students' help seeking within the environment correlates with help-seeking behavior in other contexts. Yet it is not clear whether assessing help-seeking behavior and giving appropriate feedback in more open-ended tasks is feasible and useful. Finally, while Study 2 found transfer to new domain knowledge, the Help Tutor has yet to pass the ultimate test: far transfer of the help-seeking skills to different learning environments.

To conclude, the two studies demonstrate that detailed metacognitive models of help seeking can be used by a tutoring system to assess students' moment-to-moment learning behavior unobtrusively. Furthermore, metacognitive feedback, driven by a model of help seeking, can help students improve their behavior in the tutoring system. Last, comprehensive help-seeking support was shown not only to improve students' help-seeking behavior while supported, but also to help students acquire better help-seeking skills in a meaningful way that transfers across topics within the same tutoring environment. This support combined immediate feedback on help-seeking errors, general help-seeking instruction, and self-assessment episodes that bridged between the general principles and the detailed feedback. These results suggest that educational technologies can be used to achieve robust improvement in students' general learning skills. At the same time, they demonstrate that measurable improvement in students' learning behavior does not necessarily translate into immediate benefits at the domain level.

Acknowledgements

We thank Ryan Baker, Jo Bodnar, Ido Jamar, Terri Murphy, Sabine Lynn, Kris Hobaugh, Kathy Dickensheets, Grant McKinney, EJ Ryu, and Christy McGuire for their help. This work was supported by the Pittsburgh Science of Learning Center, which is supported by the National Science Foundation (#SBE-0354420), by a Graduate Training Grant awarded by the Department of Education (#R305B040063), and by a grant from the National Science Foundation (#IIS-0308200).

References

Al-Hilawani, Y. (2003). Measuring students' metacognition in real-life situations. American Annals of the Deaf, 148(3), 233–242.
Aleven, V., McLaren, B., Roll, I., & Koedinger, K. (2006). Toward meta-cognitive tutoring: a model of help seeking with a cognitive tutor. International Journal of Artificial Intelligence in Education, 16(2), 101–128.
Aleven, V., Stahl, E., Schworm, S., Fischer, F., & Wallace, R. M. (2003). Help seeking and help design in interactive learning environments. Review of Educational Research, 73(2), 277–320.
Anderson, J. R. (1993). Rules of the mind. Mahwah, NJ: Erlbaum.
Arbreton, A. (1998). Student goal orientation and help-seeking strategy use. In S. A. Karabenick (Ed.), Strategic help seeking: Implications for learning and teaching (pp. 95–116). Mahwah, NJ: Erlbaum.
Baker, R. S. J. d., Corbett, A. T., Koedinger, K. R., Evenson, E., Roll, I., Wagner, A. Z., Naim, M., Raspat, J., Baker, D. J., & Beck, J. (2006). Adapting to when students game an intelligent tutoring system. In M. Ikeda, K. D. Ashley, & T. W. Chan (Eds.), Proceedings of the 8th international conference on intelligent tutoring systems (pp. 392–401). Berlin: Springer.
Baker, R. S. J. d., Corbett, A. T., Roll, I., & Koedinger, K. R. (2008). Developing a generalizable detector of when students game the system. User Modeling and User-Adapted Interaction, 18(3), 287–314.
Dutke, S., & Reimer, T. (2000). Evaluation of two types of online help for application software. Journal of Computer Assisted Learning, 16(4), 307–315.
Corbett, A. T., & Anderson, J. R. (2001). Locus of feedback control in computer-based tutoring: impact on learning rate, achievement and attitudes. In J. Jacko, A. Sears, M. Beaudouin-Lafon, & R. Jacob (Eds.), Proceedings of the CHI 2001 conference on human factors in computing systems (pp. 245–252). New York: ACM.
Gama, C. (2004). Metacognition in interactive learning environments: the reflection assistant model. In J. C. Lester, R. M. Vicario, & F. Paraguaçu (Eds.), Proceedings of the 7th international conference on intelligent tutoring systems (pp. 668–677). Berlin: Springer.
Karabenick, S. A., & Knapp, J. R. (1991). Relationship of academic help seeking to the use of learning strategies and other instrumental achievement behavior in college students. Journal of Educational Psychology, 83(2), 221–230.
Karabenick, S. A., & Newman, R. S. (2009). Seeking help: generalizable self-regulatory process and social-cultural barometer. In M. Wosnitza, S. A. Karabenick, A. Efklides, & P. Nenniger (Eds.), Contemporary motivation research: From global to local perspectives (pp. 25–48). Goettingen, Germany: Hogrefe & Huber.
Koedinger, K. R., & Corbett, A. T. (2006). Cognitive tutors: technology bringing learning science to the classroom. In K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 61–78). New York: Cambridge University Press.
Luckin, R., & du Boulay, B. (1999). Ecolab: the development and evaluation of a Vygotskian design framework. International Journal of Artificial Intelligence in Education, 10(2), 198–220.
MacLeod, W. B., Butler, D. L., & Syer, K. D. (1996, April). Beyond achievement data: Assessing changes in metacognition and strategic learning. Paper presented at the annual meeting of the American Educational Research Association, New York. Retrieved 13 August, 2010, from http://ecps.educ.ubc.ca/faculty/Butler/Confer/AERA%25201996%2520metacognition.pdf
Mathews, M., & Mitrovic, T. (2008). How does students' help-seeking behaviour affect learning? In E. Aimeur & B. P. Woolf (Eds.), Proceedings of the 9th international conference on intelligent tutoring systems (pp. 363–372). Berlin: Springer.
Mercier, J., & Frederiksen, C. H. (2007). Individual differences in graduate students' help-seeking process in using a computer coach in problem-based learning. Learning and Instruction, 17(2), 180–203.
Nelson-Le Gall, S. (1981). Help-seeking: an understudied problem-solving skill in children. Developmental Review, 1(3), 224–246.
Newman, R. S. (1994). Academic help seeking: a strategy of self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulation of learning and performance: Issues and educational applications (pp. 293–301). Hillsdale, NJ: Erlbaum.
Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 451–502). San Diego, CA: Academic.
Renkl, A. (2002). Worked-out examples: instructional explanations support learning by self-explanations. Learning and Instruction, 12(5), 529–556.
Ringenberg, M. A., & VanLehn, K. (2006). Scaffolding problem solving with annotated, worked-out examples to promote deep learning. In M. Ikeda, K. D. Ashley, & T. W. Chan (Eds.), Proceedings of the 8th international conference on intelligent tutoring systems (pp. 624–634). Berlin: Springer.
Roll, I., Aleven, V., McLaren, B. M., & Koedinger, K. R. (2007a). Can help seeking be tutored? Searching for the secret sauce of metacognitive tutoring. In R. Luckin, K. R. Koedinger, & J. Greer (Eds.), Proceedings of the 13th international conference on artificial intelligence in education (pp. 203–210). Amsterdam: IOS.
Roll, I., Aleven, V., McLaren, B. M., & Koedinger, K. R. (2007b). Designing for metacognition – applying cognitive tutor principles to the tutoring of help seeking. Metacognition and Learning, 2(2), 125–140.
Roll, I., Aleven, V., McLaren, B. M., Ryu, E., Baker, R. S., & Koedinger, K. R. (2006). The help tutor: does metacognitive feedback improve students' help-seeking actions, skills and learning? In M. Ikeda, K. D. Ashley, & T. W. Chan (Eds.), Proceedings of the 8th international conference on intelligent tutoring systems (pp. 360–369). Berlin: Springer.
Ryan, A. M., & Shin, H. (2011). Help-seeking tendencies during early adolescence: An examination of motivational correlates and consequences for achievement. Learning and Instruction, 21(2), 247–256.
Shih, B., Koedinger, K. R., & Scheines, R. (2008). A response time model for bottom-out hints as worked examples. In R. S. J. d. Baker, T. Barnes, & J. E. Beck (Eds.), Proceedings of the 1st international conference on educational data mining (pp. 117–126). Montreal, Canada.
Shute, V. J., & Glaser, R. (1990). A large-scale evaluation of an intelligent discovery world: Smithtown. Interactive Learning Environments, 1(1), 51–77.
Schworm, S., & Renkl, A. (2002). Learning by solved example problems: instructional explanations reduce self-explanation activity. In W. D. Gray & C. D. Schunn (Eds.), Proceedings of the 24th annual conference of the cognitive science society (pp. 816–821). Mahwah, NJ: Erlbaum.
Tobias, S., & Everson, H. (2002). Knowing what you know and what you don't: Further research on metacognitive knowledge monitoring. New York: College Board.
VanLehn, K. (2006). The behavior of tutoring systems. International Journal of Artificial Intelligence in Education, 16(3), 227–265.
Wagster, J., Tan, J., Wu, Y., Biswas, G., & Schwartz, D. L. (2007). Do learning by teaching environments with metacognitive support help students develop better learning behaviors? In D. S. McNamara & J. G. Trafton (Eds.), Proceedings of the 29th annual meeting of the cognitive science society (pp. 695–700). Austin, TX: Cognitive Science Society.
Wood, H. A., & Wood, D. J. (1999). Help seeking, learning, and contingent tutoring. Computers and Education, 33(2), 153–169.