Web-based Mathematics Homework: A Case Study Using WeBWorK in College Algebra Classes

Angelo Segalla and Alan Safer

Department of Mathematics and Statistics
California State University, Long Beach

Abstract

In this paper we examine the effect of WeBWorK, a web-based mathematics homework system, on student achievement in several sections of College Algebra, comparing the gain from pretest to posttest of students who used WeBWorK with that of students who did traditional paper and pencil homework. We found that students who used WeBWorK performed as well as those who used the traditional method. We conclude that when WeBWorK is used as an integral part of a mathematics course it can 1) augment the teaching-learning process by providing the instructor with valuable information on students’ homework performance that would not ordinarily be available, and 2) save the instructor a great deal of time and energy that can be redirected to other aspects of the course.

Keywords: Web-based homework, homework, mathematics, student achievement, college algebra, quizzes.

I. Introduction

WeBWorK is a system developed by Mike Gage and Arnie Pizer at the University of Rochester that allows students to do mathematics homework over the Internet; it is currently used in over 150 universities and community colleges, representing over 80,000 students. In this paper we give an overview of how WeBWorK operates and present the results of our research on its effectiveness in our setting.

At California State University, Long Beach (CSULB) we conducted a pilot study during the fall term of 2001, followed by a more formal three-semester study designed to measure the effect of WeBWorK used in college algebra classes. We explored its effect on student pretest-posttest scores in college algebra and, more specifically, its effect on some subpopulations. Our results generally concur with nascent research on WeBWorK [8], are consistent with studies done on other Internet homework systems [1, 2, 5, 10, 12, 14], and also agree with conclusions stated at several presentations in the last three years at national Mathematical Association of America (MAA) conferences by faculty who use WeBWorK in their mathematics classes [7, 8].

Other web-based programs, such as WebAssign, WebCT, OWL, LON-CAPA, and the UT Homework Service (see [9, 11, 17, 18, 19]), tend to be course management systems with homework as only one component, and they characteristically accept only multiple-choice or fill-in answers. WeBWorK, in contrast, was developed specifically to accept mathematics homework answers written in mathematical notation [20], an important consideration when choosing a web-based homework system. In addition, WeBWorK has the largest freeware database of homework problems for lower division mathematics courses.

II. A Brief Description of WeBWorK

WeBWorK is a free, open-source, web-based software program designed specifically for mathematics homework [20]. Unlike computer-assisted instruction, WeBWorK is not designed to teach or tutor; it simply delivers, grades, and records students’ homework, quizzes, and practice exams. WeBWorK databases contain thousands of problems organized by topic and subtopic, ranging from college algebra to multivariable calculus and differential equations. At CSULB we contributed hundreds of college algebra and precalculus problems based on problems from James Stewart’s textbooks [15, 16].

With WeBWorK the student enters answers directly on the computer, receiving immediate feedback on each problem. The system allows multiple tries and reports detailed information to the instructor on each student’s performance. WeBWorK individualizes problems for each student by generating variants of a core problem with randomized parameters. For example, the program uses the quadratic equation ax² + bx + c = 0 with different values for the parameters a, b, and c to generate multiple problems. Thus, Albert may be asked to solve 6x² – 13x + 6 = 0, whereas Bonnie gets 6x² + 13x – 5 = 0.
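WeBWorK’s problem templates are actually written in Perl [13]; purely to illustrate the idea, the following Python sketch (with hypothetical function names, not WeBWorK’s interface) shows how a core quadratic problem can be turned into a stable, individualized instance for each student:

```python
import random

def quadratic_instance(student_id: str, assignment_seed: str = "set3") -> dict:
    """Generate an individualized 'solve ax^2 + bx + c = 0' problem.

    Seeding the generator with the student ID keeps the instance the
    same from session to session, as WeBWorK problems do.
    """
    rng = random.Random(f"{assignment_seed}:{student_id}")
    # Build the quadratic from factors (a1*x - p)(a2*x - q) so that it
    # always has the nice rational roots p/a1 and q/a2.
    a1, a2 = rng.choice([2, 3]), rng.choice([2, 3])
    p = rng.choice([-3, -2, 2, 3])
    q = rng.choice([-5, -4, 4, 5])
    a, b, c = a1 * a2, -(a1 * q + a2 * p), p * q
    return {"statement": f"Solve {a}x^2 {b:+d}x {c:+d} = 0",
            "roots": sorted((p / a1, q / a2))}

# Albert and Bonnie each get their own variant of the same core problem.
print(quadratic_instance("albert")["statement"])
print(quadratic_instance("bonnie")["statement"])
```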

Answers are entered using standard calculator notation. For example, if a question asks the student to factor x⁴ – 4x² + 3 completely, the student enters

(x - 1)(x + 1)(x^2 - 3)

The program then determines whether the answer is correct. The student may enter an answer in any one of many equivalent algebraic forms, and WeBWorK will still recognize the answer. This feature is one of the strengths of the program, giving it some similarity to an actual instructor.
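WeBWorK’s checker is part of its Perl answer-evaluation machinery; one common technique behind this kind of flexibility, sketched below in Python purely for illustration, is to compare the student’s expression with the instructor’s numerically at several sample points instead of matching strings. (The sketch leans on Python’s eval, so it requires explicit multiplication signs and is unsafe on untrusted input; a real system uses a proper expression parser and accepts implicit multiplication.)

```python
import math
import random

def equivalent(student: str, correct: str, var: str = "x",
               samples: int = 10, tol: float = 1e-9) -> bool:
    """Accept any algebraically equivalent form by comparing values
    at random sample points rather than comparing strings."""
    def value(expr: str, x: float) -> float:
        # Calculator notation: '^' means exponentiation.
        return eval(expr.replace("^", "**"), {"__builtins__": {}}, {var: x})

    rng = random.Random(0)
    for _ in range(samples):
        x = rng.uniform(-10, 10)
        try:
            if not math.isclose(value(student, x), value(correct, x),
                                rel_tol=tol, abs_tol=tol):
                return False
        except (ZeroDivisionError, OverflowError):
            continue  # skip points where either expression is undefined
    return True

# Any equivalent factorization of x^4 - 4x^2 + 3 is accepted:
print(equivalent("(x - 1)*(x + 1)*(x^2 - 3)", "x^4 - 4*x^2 + 3"))  # True
print(equivalent("(x^2 - 1)*(x^2 - 3)", "x^4 - 4*x^2 + 3"))        # True
print(equivalent("(x - 1)*(x + 1)*(x^2 + 3)", "x^4 - 4*x^2 + 3"))  # False
```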

How Instructors Use WeBWorK

Setting up a homework assignment with WeBWorK is similar to the traditional method of picking problems from a textbook. With WeBWorK an instructor chooses problems from a large national database in an easy “point and click” manner. An instructor may add a new problem to the database but would need some facility with coding in the Perl programming language [13]. As in the traditional setting, WeBWorK allows the instructor to choose due dates and amount of credit allowed for each assignment. On the other hand, there are options in WeBWorK that do not exist with traditional homework. For example, the instructor can limit the number of attempts a student can make to solve each problem. Also, since the program assigns different problems to each student, the potential for students copying answers from each other is eliminated. Consequently, students may be encouraged to work cooperatively, yet each student must submit his or her own answers.

A significant feature of WeBWorK is the immediate and detailed reports it provides to the instructor on student performance on a given homework assignment. The instructor receives this information in the form of a spreadsheet. Troublesome homework problems can easily be identified—for example, particular problems with an unusually large number of attempts, or a low correct response rate, probably need to be explained further in class.
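As an illustration of how such a report might be scanned automatically, here is a short Python sketch; the CSV layout and column names are hypothetical stand-ins (the spreadsheet WeBWorK actually produces varies by version), and the thresholds are arbitrary:

```python
import csv

def troublesome_problems(report_csv: str,
                         max_avg_attempts: float = 4.0,
                         min_correct_rate: float = 0.6) -> list:
    """Flag problems that likely need further discussion in class:
    an unusually high average number of attempts per student, or a
    low correct-response rate."""
    flagged = []
    with open(report_csv, newline="") as f:
        for row in csv.DictReader(f):
            attempts = float(row["avg_attempts"])  # hypothetical column
            correct = float(row["correct_rate"])   # hypothetical column
            if attempts > max_avg_attempts or correct < min_correct_rate:
                flagged.append((row["problem"], attempts, correct))
    return flagged

for name, attempts, rate in troublesome_problems("set3_report.csv"):
    print(f"{name}: {attempts:.1f} attempts/student, {rate:.0%} correct")
```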

To enhance the WeBWorK experience, the instructor may set up the Help button to provide information, general guidelines, and hints for an assignment. In our study, instructors encouraged students to use the Feedback button to ask questions, resulting in a large number of weekly emails fielded by the administrators.

How Students Use WeBWorK

From the student’s perspective, the process of using WeBWorK to do homework is straightforward. The student logs in, chooses the appropriate problem set, and is ready to work on the assignment. A student can access his or her unique homework assignment at any time and continue wherever he or she left off (individualized problems do not change from session to session).

Figure 1 displays a typical algebra problem, the student’s inputs, and WeBWorK’s feedback. Note that the second answer in the sample problem is incorrect and that the student may submit another answer. WeBWorK would have accepted an answer like 1/3 – 12 unless the instructor specified that the answer be in simplest form, perhaps to check for order of operations. Students get immediate feedback via the Preview button, which verifies that student and computer agree on syntax (only the Entered and Answer Preview columns are displayed in Figure 1), while Check Answers tells the student whether the answers submitted are correct or incorrect (all three columns are displayed).

Figure 1. Sample algebra problem, student's answers, and WeBWorK's immediate feedback.

The number of times a student can resubmit answers is unlimited unless the instructor limits them when setting up the assignment. At any time during an assignment the student can obtain a record of his or her progress by pressing the Get Summary button.

In beginning-of-semester instructor training sessions and subsequent procedural memoranda and emails about WeBWorK, instructors were asked to encourage students to work cooperatively. However, we did not determine how or even whether students in different sections implemented cooperative work. This unknown is a possible confounding variable in the results, since cooperative work might affect the number of problems attempted, the quality of the answers, and the attitudes and motivations of the students. These differences could have affected performance on the posttests, possibly increasing the absolute gain in some sections relative to others.

III. Participants

At CSULB we used WeBWorK for four semesters in several lower division mathematics courses, comprising more than 60 course sections taught by 25 different instructors.

In this paper we consider the College Algebra course, which represents by far the largest group of students. Table 1 shows the number of College Algebra sections by semester and the number of students in the study. In the discussion that follows, WBH refers to sections that used web-based homework, and PPH to sections that used traditional paper and pencil homework.

Table 1
College Algebra Classes in the WeBWorK Study

                                                    Fall 2001*   Spring 2002   Fall 2002**   Spring 2003**   Total
Sections (students) using WeBWorK homework (WBH)    12 (374)     8 (301)       13 (458)      11 (365)        44 (1428)
Sections (students) using paper and pencil (PPH)    15 (496)     9 (310)       6 (219)       1 (55)          31 (1080)
Sections (students) that did not participate        2 (85)       9 (308)       12*** (245)   6*** (116)      29 (754)

* Fall 2001 was used as a pilot study.
** In Fall 2002 and Spring 2003 newly instituted large sections did not participate in the study.
*** Includes large section activity breakout sections as separate classes.

CSULB enrolls between 600 and 800 students each semester in College Algebra in sections of approximately 35 students. Table 2 displays some of the characteristics of all the college algebra students in this study. The “typical student” is a young first-year full-time female student who averaged nearly a 20% increase from pretest to posttest and earned a grade of A, B, or C in the course.

Table 2
CSULB College Algebra Distribution by Level, Gender, Ethnicity, Age, Semester Units Enrolled, WBH/PPH, and Mean ACT Score (Spring 2002, Fall 2002, and Spring 2003; n = 1149)

Freshman    Sophomore   Junior    Senior    Graduate    Female    Male
71.40%      18.20%      4.30%     4.60%     0.30%       70.50%    29.50%

Caucasian   African-American   Latino    Asian     Other    Age <= 23   Age > 23
34.20%      7.50%              26.80%    21.90%    9.60%    93.00%      7.00%

0-11 sem units   12-16 sem units   17+ sem units   WeBWorK (WBH)   Traditional (PPH)   Mean ACT   ACT S.D.
7.50%            86.10%            6.00%           65.10%          34.90%              18.90      6.50

Instructor Participants

Characteristically, College Algebra instructors at CSULB can be divided into three major categories: 1) graduate teaching assistants (40%) working on their master’s degrees in mathematics, with little or no college teaching experience; 2) adjunct faculty (40%) with a master’s degree in mathematics and some experience teaching college algebra; and 3) full-time tenure-track professors (20%).

IV. Research Questions

To measure the effectiveness of WeBWorK in our College Algebra courses, we considered several questions:

  1. Do students who use WeBWorK perform at least as well as those who do traditional homework?
  2. Do students do more homework with WeBWorK than with traditional homework?
  3. Are there student subgroups that perform better with WeBWorK?

Along with the pretest and posttest results, WeBWorK’s ability to record in detail each student’s homework performance provided the data needed to analyze the three questions.

V. Design and Methods

We performed a pilot study during the fall 2001 semester to guide our more formal studies during the next three semesters. All students and instructors were aware that they were participating in a study, and, although encouraged to take part, participation was voluntary. Indeed, some instructors and a small number of individual students chose not to participate. All classes used the same textbook [15].

Design of the Study

Each semester, cooperating College Algebra sections were assigned randomly to the experimental web-based homework group (WBH) or to the control traditional paper and pencil homework group (PPH). When an instructor taught two course sections, we randomly placed one section in WBH and the other in PPH. Except for the spring 2002 semester, instructors who taught two sections were allowed to switch so that both classes were in the same group. In both WBH and PPH, approximately 35% of the students who started the study did not complete it; some withdrew from the course, and others did not take or did not complete the pretest or posttest.

The College Algebra course coordinator provided each PPH and WBH instructor with the same list of homework exercises for the entire semester. The project programming team coded that same list into WeBWorK to be used by WBH students. Instructors in both groups were required to use at least 80% of the problems on this list for their assignments during the semester, allowing additions and deletions to accommodate individual teaching styles. Only one or two WBH instructors chose to amend the homework list, whereas twice as many PPH instructors did so.

For this article we use gain from pretest to posttest, demographic information, and class grade.

Measuring Effectiveness of WeBWorK

Our basic measure of the effectiveness of WeBWorK was the gain (or absolute gain) in score from pretest to posttest.

ABSOLUTE GAIN = POSTTEST SCORE – PRETEST SCORE

At the beginning of each semester a 25-item multiple-choice pretest, designed by the authors with the assistance of the College Algebra course coordinator, was administered to all classes. The same test was used for the posttest at the end of the course. Five experienced college mathematics instructors established face and content validity for the exam.

For comparison, we also analyzed the data with a widely used statistic suggested by Hake [6]. It is defined as the gain divided by the maximum possible gain:

NORMALIZED GAIN = (POSTTEST SCORE – PRETEST SCORE) / (MAXIMUM SCORE – PRETEST SCORE)

Normalized gain, unlike absolute gain, compensates for the initial knowledge of the test taker and provides an additional view of the same data. Normalized-gain values range from zero to one when the posttest score is at least the pretest score (a student who scores lower on the posttest has a negative normalized gain). A student or section with a high score on the pretest has much less room for absolute gain than a person or section with a low pretest score, and the normalized gain attempts to account for that difference. Thus, even though we relied primarily on absolute gain in this study, we ran some parallel analyses with the normalized gain and obtained similar results with both methods, as described below.
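Both measures are simple computations. The sketch below is illustrative Python; the sample scores are made up, but the 25-point maximum and the 0.3/0.7 category boundaries discussed in the Results section come from the study and from Hake [6]:

```python
MAX_SCORE = 25  # the pretest/posttest in this study had 25 items

def absolute_gain(pre: int, post: int) -> int:
    return post - pre

def normalized_gain(pre: int, post: int, max_score: int = MAX_SCORE) -> float:
    # Gain divided by the maximum possible gain (Hake [6]).
    # A perfect pretest (pre == max_score) must be handled separately.
    return (post - pre) / (max_score - pre)

def hake_category(g: float) -> str:
    # Hake's bands: "high" above 0.7, "medium" from 0.3 to 0.7, "low" below 0.3.
    return "high" if g > 0.7 else "medium" if g >= 0.3 else "low"

# A student who goes from 10/25 on the pretest to 16/25 on the posttest:
pre, post = 10, 16
g = normalized_gain(pre, post)
print(absolute_gain(pre, post), round(g, 2), hake_category(g))  # 6 0.4 medium
```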

VI. Results

Overarching Comparison of WeBWorK Versus Paper and Pencil Homework

Is WeBWorK a viable alternative to traditional paper and pencil homework? In other words, does WeBWorK make a difference? We used descriptive statistics to summarize the variables, and analysis of variance (ANOVA) and t-tests to measure differences in average absolute gain between the experimental and control groups. An analysis of variance on gain from the pretest to the posttest across all three semesters (comparing all students in WBH with all students in PPH) indicated no statistically significant difference (p = 0.55). The mean gain from pretest to posttest for the WBH group was 4.35 (out of 25 questions), slightly higher than the 4.15 for the PPH group, but the difference was not statistically significant. Thus it appears that students who do mathematics homework using WeBWorK perform as well as those who use the traditional paper and pencil method. This is consistent with studies of other Internet homework systems [1, 2, 5, 7, 10, 12, 14]. Moreover, other research indicates that the mathematics achievement of students who use WeBWorK is as high as, and for certain subpopulations higher than, that of students who use the traditional method [8].
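The comparison itself takes a standard two-sample form; the scipy sketch below uses synthetic gain scores as stand-ins for the study’s data (the paper does not say whether equal variances were assumed, so the sketch shows a Welch t-test alongside the two-group ANOVA):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic absolute-gain scores standing in for the study's data;
# the real WBH and PPH means were 4.35 and 4.15 out of 25.
wbh_gain = rng.normal(4.35, 4.0, size=300)
pph_gain = rng.normal(4.15, 4.0, size=300)

t, p = stats.ttest_ind(wbh_gain, pph_gain, equal_var=False)  # Welch t-test
print(f"t = {t:.2f}, p = {p:.2f}")

# With exactly two groups, one-way ANOVA is equivalent to a pooled t-test.
f, p_anova = stats.f_oneway(wbh_gain, pph_gain)
print(f"F = {f:.2f}, p = {p_anova:.2f}")
```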

Similar results are obtained if “mean normalized gain” from pretest to posttest is used as the measure of student improvement. The mean normalized gain was 0.29 for all students in WBH and 0.27 for those in PPH (with a standard deviation of 0.01 for each group). As was the case for the mean gain, the mean normalized gain for the WBH group was slightly higher than for the PPH group, again showing that WBH students performed at least as well as their PPH counterparts. Following Hake, we considered normalized gains in three categories: “high” for a normalized gain greater than 0.7, “medium” between 0.3 and 0.7, and “low” below 0.3. In both WBH and PPH conditions, the normalized gain for students in this study is in the “low” category as defined by Hake [6].

We refined our analysis by controlling for specific variables, considering certain student subpopulations, and exploring student and instructor attitudes toward WeBWorK.

Controlling for the Instructor

To consider student performance with WeBWorK after controlling for the instructor effect, we looked at eight instructors, each of whom taught two sections of College Algebra, one with WeBWorK and one with traditional homework. Each class was given the same pretest and posttest, and the absolute gain (posttest score – pretest score) was calculated for each student. The mean gain for each section, together with the standard error (S.E.), the class size (n), and the p-value for a difference-in-means t-test, is shown in Table 3.

Table 3
Mean Absolute Gain: WeBWorK vs. Traditional Homework

Instructor   WeBWorK Mean Gain   S.E.   n    Traditional Mean Gain   S.E.   n    p-value
Alan         5.48                0.93   25   5.18                    0.50   28   0.78
Betty        4.31                0.49   26   4.86                    0.60   22   0.48
Cathy        1.93                0.76   15   2.36                    0.69   22   0.68
Dave         4.17                0.72   23   4.23                    0.59   30   0.95
Eric         4.00                0.65   9    2.50                    0.65   10   0.12
Florence     5.31                0.54   36   6.03                    0.59   31   0.37
Greg         2.74                0.62   19   3.83                    0.56   30   0.20
Haley        4.21                0.89   19   3.05                    0.82   20   0.39

For each instructor, the p-values indicate that the difference in mean absolute gain between that instructor’s WBH and PPH sections cannot reasonably be attributed to the homework condition. An analysis of variance controlling for the instructor likewise shows no significant difference (p = 0.71) in mean absolute gain. There is, however, a significant correlation (r = 0.74, p = 0.038) between the WBH and PPH section means by instructor, which suggests that the instructor was a major factor in a section’s average absolute gain under either homework condition.
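This correlation can be reproduced directly from the section means published in Table 3; a short scipy sketch:

```python
from scipy import stats

# Section mean absolute gains from Table 3, one pair per instructor
# (Alan, Betty, Cathy, Dave, Eric, Florence, Greg, Haley).
wbh = [5.48, 4.31, 1.93, 4.17, 4.00, 5.31, 2.74, 4.21]
pph = [5.18, 4.86, 2.36, 4.23, 2.50, 6.03, 3.83, 3.05]

# Do instructors whose WBH sections gain more also have PPH sections
# that gain more? Pearson correlation across the eight instructors.
r, p = stats.pearsonr(wbh, pph)
print(f"r = {r:.2f}, p = {p:.3f}")  # reproduces the reported r = 0.74
```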

Controlling for Grade Received in the Class

Some users of WeBWorK report anecdotal evidence suggesting that high achievers as well as low achievers do better with WeBWorK than with traditional homework. To test this theory against our data, we compared students’ mean gain by the grade received in the course. A strong caveat is in order: the instructors assigned grades using their own best judgment, which could differ a great deal from instructor to instructor. Since the instructor could be a major factor in determining a section’s average absolute gain, the instructor’s grading practices might affect the relationship between gain and grade in unknown ways. However, as a rough measure of how top students from all sections performed, we decided to examine this statistic.

Table 4 shows the mean gain for the WeBWorK groups and the traditional groups for the academic year 2001-2002 by letter grade received in the course. The data for these semesters include only those students who completed both the pretest and posttest.

Table 4
Mean Absolute Gain by Letter Grade

Grade Received   WeBWorK Mean Gain   S.E.   n    Traditional Mean Gain   S.E.   n     p-value

Fall 2001
A                7.1                 0.59   39   3.6                     0.50   81    0.00
B                7.1                 0.48   41   3.4                     0.47   99    0.00
C                3.7                 0.57   31   3.4                     0.48   101   0.79
D                6.1                 0.39   18   2.5                     0.40   24    0.01
F                1.8                 0.58   8    1.5                     0.35   6     0.91

Spring 2002
A                5.9                 0.43   47   5.5                     0.41   57    0.46
B                3.6                 0.42   58   4.2                     0.36   65    0.27
C                3.4                 0.57   41   3.6                     0.50   46    0.96
D                4.9                 1.08   16   2.0                     0.84   20    0.04
F                3.4                 0.97   10   2.4                     0.98   5     0.49

The table shows significantly higher mean absolute gains for the A and B students who used WeBWorK in fall 2001 compared to the traditional group; similarly, the D students showed larger gains in the WBH sections. An analysis of variance verifies that there is a significant difference (p < 0.05) in mean absolute gain for A, B, and D students between the two homework conditions. Interestingly, for C students there was no significant difference in mean gain between the two homework groups.

In the spring 2002 semester, only the D students showed a significantly higher gain with WeBWorK (see Table 4). The weaker effect in spring 2002 may be due to the greater attention paid to training WeBWorK instructors during the first semester of the study. As expected, an analysis of variance for spring 2002 did not show a significant difference in mean absolute gain between the two homework conditions by grade received in the course.

The exit interviews with the instructors made clear that there were many differences in the way they implemented WeBWorK in the two semesters. Despite the intriguing results from the first semester, the results from the two semesters taken together therefore do not allow any conclusion about differential effects of homework conditions on students at different performance-grade levels.

Homework Completion Rates

Some WeBWorK installations report that the program motivates students to do more homework than they would ordinarily do with traditional homework. The University of Rochester, for example, reports that more than 90% of its students complete homework assignments. In general, research on how much homework is done by typical college students is varied and often conflicting, yet 90% seems a high completion rate for any setting [3].

We were not logistically able to collect as much homework data for the PPH students as WeBWorK automatically compiled for the WBH students. With this limitation, and using an arbitrary “homework completion index” of 50%, Table 5 presents the findings.

WBH students who completed at least half of their homework assignments averaged a gain of 4.8 points out of 25, compared to 3.6 for those who completed less than half, a statistically significant difference (p = 0.02) that agrees with the results in Hirsch and Weibel [8]. Similarly, 26.2% of students who completed at least half of their assignments gained 8 or more points (out of 25) from pretest to posttest, compared with only 15.8% of those who completed less than half, also a statistically significant difference (p = 0.04).

Although this analysis does not help pinpoint particular classes in which the instructor integrated WeBWorK more fully into the course, it does show that the program encourages students to do more homework. Presenters at several national meetings of the American Mathematical Society and the Mathematical Association of America have established similar results for WeBWorK, reporting that “students worked until they got the right answers,” and that “many students worked collaboratively.” Further, those researchers are attempting to “understand the connection between WeBWorK performance and overall course performance,” “determine if information captured by system can identify at-risk students,” and “determine if patterns of errors can be identified” [21]. The authors of this paper are exploring similar questions, as well as that of using the Hake metric in our ongoing research on WeBWorK.

Table 5
WBH Student Completion Rates and Gain from Pretest to Posttest

                                       Gain Intervals
Homework assignments   Lowest* through 2     3 through 7       8 and above
completed              n      Row %          n      Row %      n      Row %
Less than half         52     43.3%          49     40.8%      19     15.8%
More than half         46     32.6%          58     41.1%      37     26.2%

* Some gains actually negative. Data checked manually to assure accuracy.

In Table 5, if “more than half” is changed to “more than 55%,” then significant differences in gain are observed: 68% of those students gain at least 3 points (versus 56.6% for the less-than-half group in the table) and 29% gain at least 8 points (versus 15.8%).
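The 26.2% versus 15.8% comparison reported above can be checked directly from the counts published in Table 5 with a two-proportion z-test; a minimal sketch (the group sizes 120 and 141 are the row totals implied by the table):

```python
from math import sqrt
from scipy.stats import norm

# From Table 5: students gaining 8 or more points, by completion group.
gained8 = [19, 37]    # completed less than half, completed more than half
n = [120, 141]        # row totals from Table 5

p1, p2 = gained8[0] / n[0], gained8[1] / n[1]
pooled = sum(gained8) / sum(n)
se = sqrt(pooled * (1 - pooled) * (1 / n[0] + 1 / n[1]))
z = (p2 - p1) / se
p_value = 2 * norm.sf(abs(z))  # two-tailed p, close to the reported 0.04
print(f"{p1:.1%} vs {p2:.1%}: z = {z:.2f}, p = {p_value:.3f}")
```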

VII. Conclusions

When used for College Algebra classes, WeBWorK homework is as effective as traditional paper and pencil homework. The overarching advantage of WeBWorK, repeatedly stated by instructors, is its ability to collect, grade, and record homework, allowing the instructor to pinpoint troublesome homework exercises from the spreadsheet and to home in on concepts needing discussion in class. This ability saved instructors considerable time, which they could devote to other parts of the course. Because WeBWorK keeps track of student performance, abundant data are available on the homework habits of WeBWorK students. To assess the effectiveness of online homework more fully, however, more complete comparative data will be needed on students who do traditional homework.

Acknowledgment

The authors wish to express their appreciation to the National Science Foundation for its generous support of this project (award 0311739).

References

  1. S. Bonham, R. Beichner, and D. Deardorff, Online homework: Does it make a difference? The Physics Teacher, 39 (2001), 293-296.
  2. S. Bonham, A. Titus, R. J. Beichner, and L. Martin, Education research using web-based systems. Journal of Research on Computing in Education, 33 (2000) 28-45.
  3. E. J. Cancio, R. P. West, and K. R. Young, Improving mathematics homework completion and accuracy of students with EBD through self-management and parent participation, Journal of Emotional and Behavioral Disorders, 12 (2004), 9-22.
  4. J. B. Ellsworth, Surviving Change: A Survey of Educational Change Models, ERIC Clearing House on Information and Technology, Syracuse University, Syracuse, N.Y, 2000.
  5. R. E. Flori, R. H. Hall, N. Hubing, D. Oglesby, T. Philpot, and V. Yellamraju, Incorporating web-based homework problems in engineering dynamics, Proceedings of the American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, 2002.
  6. R. Hake, Analyzing Change/Gain Scores [http://physics.indiana.edu/~sdi/AnalyzingChange-Gain.pdf], message originally posted on American Educational Research Association Division D electronic mailing list, 1999.
  7. S. Hauk, R. Powers, A. Safer, and A. Segalla, Impact of the web-based homework program WeBWorK on student performance in moderate enrollment college algebra courses [http://hopper.unco.edu/faculty/personal/hauk/segalla/WBWquan.pdf], paper presented at national meeting of the Mathematical Association of America (n.d.).
  8. L. Hirsch and C. Weibel, Statistical evidence that web-based homework helps. FOCUS: The Newsletter of the Mathematical Association of America, 23 (2003), 14.
  9. LON-CAPA, Learning on Line—Computer Assisted Personalized Approach [http://www.lon-capa.org/]. Web-based, open-source, freeware course management system developed at Michigan State University.
  10. J. Mestre, R. Dufresne, D. Hart, and K. Rath, The effect of web-based homework on test performance in large enrollment introductory physics courses, Journal of Computers in Mathematics and Science Teaching, 21 (2002), 229.
  11. OWL, Online Web-based Learning [http://owl1.thomsonlearning.com/]. OWL is published by Thomson Learning and developed at the University of Massachusetts.
  12. A. Pascarella, CAPA (Computer-Assisted Personalized Assignments) in a Large University Setting, doctoral dissertation, University of Colorado, Boulder, 2002. Dissertation Abstracts International, 63, 2872.
  13. Perl [http://www.perl.com/]. Perl is a text processing programming language.
  14. D. Pritchard and E. Morote, Reliable Assessment with Cybertutor, a Web-based Homework Tutor, World Conference on E-learning in Corporate, Government, HealthCare, and Higher Education, E-Learn, Montreal, Canada, 2002.
  15. J. Stewart, L. Redlin, and S. Watson, College Algebra, 4th edition, Brooks/Cole, Pacific Grove, California, 2004.
  16. J. Stewart, L. Redlin, and S. Watson, Precalculus: Mathematics for Calculus, 4th edition, Brooks/Cole, Pacific Grove, California, 2002.
  17. UT Homework Service, University of Texas Homework Services [https://hw.utexas.edu/overview.html]. Web-based interactive freeware program developed at the University of Texas.
  18. WebAssign, WebAssign [http://www.webassign.net/]. Proprietary web-based homework service developed at North Carolina State University.
  19. WebCT, WebCT [http://webct.com/]. Online course delivery and management systems.
  20. WeBWorK, Web-based Homework [http://webwork.math.rochester.edu/docs/]. Original installation of WeBWorK at the University of Rochester.
  21. WeBWorK Minicourse at the AMS/MAA National Meeting, January 2003, Baltimore, Maryland.

Posted April 5, 2006.

All material appearing in this journal is subject to applicable copyright laws.
Publication in this journal in no way indicates the endorsement of the content by the California State University, the Institute for Teaching and Learning, or the Exchanges Editorial Board. ©2006 by Angelo Segalla and Alan Safer.
