Description

Many previous studies have analyzed human tutoring in great depth and have shown that expert human tutors produce effect sizes roughly twice those produced by an intelligent tutoring system (ITS). However, there is no consensus on which factors make expert tutors so effective. Identifying these factors is important, because the same phenomena could then be replicated in an ITS so that it reaches the proficiency of expert human tutors. Also, to the best of my knowledge, no one has examined how students react while working with a computer-based tutor. Answers to both questions are needed in order to build a highly effective computer-based tutor; my research focuses on the second one. In the first phase of my thesis, I used verbal-protocol analysis to study how students behave while working with Andes, a step-based tutor. This analysis revealed several ways in which students use a step-based tutor, which can pave the way for more effective computer-based tutors. In particular, I found that students often keep trying to fix errors by guessing repeatedly instead of asking for help by clicking the hint button, a phenomenon known as hint refusal. Surprisingly, a large portion of the students' floundering was due to hint refusal. The hypothesis tested in the second phase of the research is that hint refusal can be significantly reduced, and learning significantly increased, if Andes gives more unsolicited hints and meta-hints. An unsolicited hint is a hint given without the student asking for one. A meta-hint is likewise given without being asked for, but it merely prompts the student to click the hint button. Two versions of Andes were compared: the original version and a new version that gave more unsolicited hints and meta-hints. During a two-hour experiment, there were large, statistically reliable differences in several performance measures, suggesting that the new hint policy was more effective.
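To make the distinction between the hint types concrete, the Python sketch below is a minimal, purely illustrative toy; it is not taken from the Andes source, and every name and threshold in it (choose_hint_action, MAX_FAILED_ATTEMPTS, the attempt counts) is a hypothetical stand-in for whatever policy Andes actually uses. It only shows how a tutor might choose among a solicited hint, a meta-hint, and an unsolicited hint when a student keeps guessing instead of asking for help.

    # Illustrative toy only -- NOT the Andes implementation.
    # All names and thresholds are hypothetical stand-ins.
    MAX_FAILED_ATTEMPTS = 2  # assumed number of failed entries tolerated before intervening

    def choose_hint_action(failed_attempts: int, clicked_hint_button: bool) -> str:
        """Decide how the tutor responds to the student's latest entry."""
        if clicked_hint_button:
            # Solicited hint: the student explicitly asked for help.
            return "solicited_hint"
        if failed_attempts > MAX_FAILED_ATTEMPTS:
            # The student keeps guessing instead of asking for help (hint refusal).
            # A meta-hint only prompts the student to click the hint button;
            # an unsolicited hint goes further and delivers help content directly.
            return "unsolicited_hint" if failed_attempts > 2 * MAX_FAILED_ATTEMPTS else "meta_hint"
        return "no_hint"

    # Example: three failed guesses with no hint request -> meta-hint;
    # six failed guesses -> unsolicited hint.
    print(choose_hint_action(3, False))  # meta_hint
    print(choose_hint_action(6, False))  # unsolicited_hint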
  • Downloads
    pdf (2.1 MB)

    Details

    Title
    • Analyzing student problem-solving behavior in a step-based tutor and understanding the effect of unsolicited hints
    Contributors
    • Rajagopalan Ranganathan (author)
    Date Created
    2011
    Resource Type
  • Text
    Note
    • Partial requirement for: M.S., Arizona State University, 2011
      Note type
      thesis
    • Includes bibliographical references (p. 54-58)
      Note type
      bibliography
    • Field of study: Computer science

    Citation and reuse

    Statement of Responsibility

    by Rajagopalan Ranganathan
