How Much Work Should We Assign?

This post originally appeared on the blog of Rice's Center for Teaching Excellence. The original post can be found here.


As I've worked with faculty and graduate students on their courses over the last few years, I've learned that there are a number of questions I will almost always be asked. And if I were to rank their frequency, "how much work should I be assigning?" ranks close to the top.

In asking this question, most are not seeking guidance about the number of out-of-class hours they can demand of students. Despite evidence that the average college student spends only 12-15 hours a week studying, there seems to be general agreement that the Carnegie Unit recommendation of two hours out of class for every credit hour, or 24-36 hours a week for a typical 12-18 credit load, is a perfectly reasonable expectation. Unlike K-12 education, where debates about the virtues of homework rage on, most recognize that the structure of higher education is such that college students should spend far more time on homework than they spend in class.
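The arithmetic behind that expectation is worth making explicit. A minimal sketch (the 12-18 credit range is our assumption about a typical full-time load):

```python
# Carnegie Unit rule of thumb: two hours of out-of-class work per
# credit hour per week.
def weekly_out_of_class_hours(credit_hours, hours_per_credit=2):
    """Expected weekly out-of-class workload for a given credit load."""
    return credit_hours * hours_per_credit

# A full-time load of 12-18 credits implies 24-36 hours a week.
print(weekly_out_of_class_hours(12))  # 24
print(weekly_out_of_class_hours(18))  # 36
```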

So what these instructors are really asking is "how much can I reasonably expect my students to accomplish within these time constraints?" This is a perfectly sensible question that all of us who teach should be asking every time we design an assignment. Yet, it turns out to be far more difficult to answer than one would hope.

In the past, I've answered by noting some combination of the following:

  1. Students will take much longer to complete a task than you would. This is obvious upon reflection (after all, you are the expert and they are the novice), but you'd be surprised how often it is ignored in practice. A commonly shared rule of thumb is that you should expect your students to take three times longer than you on assignments and exams.
     
  2. It is impossible to estimate how long students will take to complete an assignment (whether it involves reading, writing, or studying) without getting into the details of the assignment. Reading children's literature is not the same as reading Kant and analytic writing is not the same as writing a personal narrative.
     
  3. A recent large-scale study has found that the more often students take courses with at least 40 pages of reading a week *and* 20 pages of writing in a semester, the greater their gains on the Collegiate Learning Assessment. So, all things being equal, these are good minimums.
     
  4. Because expectations vary by institution and discipline, you should survey the syllabi within your department and make sure you're not far outside the norm in either direction.

Yet, over time, I've become dissatisfied with these responses. Not only do they distract attention from conversations I'd rather be having about student learning, but they also fail to really answer the question.

#2 is the perfect example of a non-answer, and if it is true that different sorts of reading and writing demand different time commitments, the data in #3 is far too broad to be helpful (40 pages of what? Read how?). #4 is good advice for political reasons, but what if departmental norms expect too much or too little of students? #1 seems safest, but rules of thumb are especially dangerous; what evidence do we have that three is the appropriate multiplier? And how confident are we that we will always accurately estimate the time it will take us to complete the assignment? As someone who has recently begun tracking her own time, I know that our estimates can be wildly mistaken (and evidence suggests our students can't be trusted to estimate their own time use, either).

In short, I realized I had no real evidence that any of the advice I was giving would help instructors estimate the amount of time they were actually asking their students to spend each week. And as I learned that previous estimates of my own reading and writing rates were underestimating the time I was spending, I realized that I was probably asking more of students than I had previously assumed. Most importantly, I realized we would all be better served if we had an easy way to estimate out of class workload that was grounded in empirical research on how students work and learn.

With this goal in mind, I turned to the research. And after a quick survey of the literature on reading, I found myself taking out my calculator to determine how much I was expecting of my students (spoiler alert: the news wasn't good). It then occurred to me that the best way to help instructors was to make it easier for them to make similar calculations. So I recruited Justin Esarey to work with me on an app that would automatically estimate course workload based on instructor inputs about assignments, as well as the research on student learning. And we are introducing our finished version of the tool today (link below). 

As we note in the estimation details, the research is more limited than I had hoped it would be. We know quite a bit about reading under normal conditions, but not as much about reading under the specialized conditions of reading for a college course. And we know next to nothing about student writing rates. So while the estimator automatically calculates reading and writing rates based on our limited research, users are able to manually adjust those estimates to get a sense of how much they are expecting of students given their own assumptions. Why include this option? Because our beta testing revealed that some faculty were assigning more work than they thought, even given their own assumptions.

So what does it mean if, like both Justin and me, you turn out to be asking your students to do more than you assumed?

It seems to me you have at least four options. You can:

  1. Reduce the number and size of your assignments.  
     
  2. Keep your assignments, but change your expectations for the work that you assign. If in the past you've assigned 100 pages a week and expected close, analytic reading, you can continue to assign the same amount but recognize that your students will only be able to skim those pages for main ideas. Likewise, if you've previously assigned 30 pages of writing with the expectation that students engage in extensive drafting and revision, you can continue to ask for 30 pages but recognize that the quality of writing you receive will be less polished.
     
  3. Ask yourself whether there may be alternative assignments that achieve the same goals in less time. If it turns out your students will need 10 hours to prepare a high-quality research presentation, but they could learn the public speaking skills you really care about by spending two hours preparing and presenting four one-minute stand-and-deliver presentations, changing your assignment may be your best option.
     
  4. Consider revising the goals of a particular unit or course via the process of backward design.

But even if you change nothing about your expectations, I hope that our estimator will at least help you to communicate those expectations to your students. Given the natural variation of student abilities and background knowledge, there will always be variation in the time it takes individual students to complete assignments. But making your targets explicit will give students a better sense of where they stand within the course, drawing their (and your) attention to inefficient study strategies that might be easily addressed. It will also help our best students (who have a tendency to overwork themselves) with their time management while ensuring that our assessments are more equitable. And if the quality of work we receive drops off precipitously (or increases dramatically), we will have learned something important about how much time our students were spending before we made these expectations explicit.


Notes: The permanent home of this estimator can be found here, and a standalone version (which you are free to embed in your own website) can be found here. Although our website tracks the IPs of visitors who use the estimator, no inputs are recorded.

Estimation Details

Somewhat surprisingly, there is very little research about the amount of time it takes the average college student to complete common academic tasks. We have self-reported estimates of how much total time they spend on academic work outside of class (12-15 hours), but we don't know much about the quality and quantity of the work that is produced in that time frame (let alone how the time is allocated to different tasks). We also know quite a bit about how students tackle common academic tasks, but those studies rarely ask students to report on how long it takes them to complete the task (whether reading a book, writing a paper, or studying for an exam). The testing literature provides some clues (because valid instrument design depends on data about the average speed of test takers), but it's tough to generalize from the experience of taking high-stakes, timed tests to the experience of working on an assignment in the comfort of your dorm. And while there is a sizable literature on reading, the nature and purpose of the reading tasks in these experiments are also quite different from what students typically encounter in college.

All of which is to say the estimates to the right are just that: estimates.

To arrive at our estimates, we began with what we knew from the literature and then filled in the gaps by making a few key assumptions. The details of our calculations are below. If you still find our assumptions unreasonable, however, the estimator allows you to manually adjust our estimated rates. We also welcome those who have knowledge of research about which we are unaware to suggest improvements.

READING RATES

Of all the work students might do outside of class, we know the most about their reading. Educators, cognitive psychologists, and linguists have been studying how human beings read for more than a century. One of the best summaries of this extensive literature is the late Keith Rayner's recently published "So Much to Read, So Little Time: How Do We Read, and Can Speed Reading Help?" A central insight of this piece (along with the literature it summarizes) is that none of us read at a constant rate. Instead, we use varying rates that depend on the difficulty and purpose of the reading task (Rayner et al., 2016; Love, 2012; Aaronson and Ferres, 1983; Carver, 1983, 1992; Jay and Dahl, 1975; Parker, 1962; Carrillo and Sheldon, 1952; Robinson and Hall, 1941). Another obvious (but rarely acknowledged) insight is that a page-based reading rate will vary with the number of words on the page. As a result, our estimator assumes that reading rate is a function of three factors: 1) page density, 2) text difficulty, and 3) reading purpose. For the sake of simplicity, we limited the variation within each factor to three levels.

Page Density*

  • 450 words: Typical of paperback pages, as well as the 6" x 9" pages of academic journal articles
  • 600 words: Typical of academic monograph pages
  • 750 words: Typical of textbook pages that are 25% images, as well as the full-size pages of two-column academic journal articles 

*Estimates were determined by direct sampling of texts in our personal collection. Note that there is a *great* deal of variation among different texts, so we advise manually counting one page (if you don't have an electronic version, count the words on 3 or 4 lines, take the average, and multiply it by the number of lines on the page).
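If you do have an electronic version, the same sampling procedure is easy to script. A minimal sketch (the function name and sample text are ours, purely for illustration):

```python
def estimate_words_per_page(sample_lines, lines_per_page):
    """Estimate page density from a few sampled lines of text.

    sample_lines: a handful of representative lines from one page.
    lines_per_page: total number of lines on that page.
    """
    words_per_line = [len(line.split()) for line in sample_lines]
    average = sum(words_per_line) / len(words_per_line)
    return round(average * lines_per_page)

# Four sampled lines (averaging 12 words each) on a 38-line page:
sample = [
    "It is a truth universally acknowledged that a single man in",
    "possession of a good fortune must be in want of a wife and",
    "however little known the feelings or views of such a man may",
    "be on his first entering a neighbourhood this truth is so well",
]
print(estimate_words_per_page(sample, 38))  # 456, close to the 450-word paperback estimate
```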

Text Difficulty

  • No New Concepts: The reader knows the meaning of each word and has enough background knowledge to immediately understand the ideas expressed.
  • Some New Concepts: The reader is unfamiliar with the meaning of some words and doesn't have enough background knowledge to immediately understand some of the ideas expressed.
  • Many New Concepts: The reader is unfamiliar with the meaning of many words and doesn't have enough background knowledge to immediately understand most of the ideas expressed.

Reading Purpose

  • Survey: Reading to survey main ideas; OK to skip entire portions of text
  • Understand: Reading to understand the meaning of each sentence
  • Engage: Reading while also working problems, drawing inferences, questioning, and evaluating

What we know from the research:

  • The optimal reading rate of the skilled adult reader (including college students) is around 300 words per minute. This assumes a "normal" reading environment in which there are no new words or concepts in the text and the purpose of the reading is to understand the meaning of each sentence (Rayner et al., 2016; Carver, 1982). 

  • Adults can read faster than 300 words per minute, but if the goal is to understand the meaning of sentences, rates beyond 300 words per minute reduce comprehension in a near linear fashion (Zacks and Treiman, 2016; Love, 2012; Carver, 1982).

  • The default reading rates of college students under these normal conditions can range from 100-400 words per minute (Rayner et al., 2016; Siegenthaler et al., 2011; Acheson et al., 2008; Carver, 1982, 1983, 1992; Underwood et al., 1990; Hausfeld, 1981; Just and Carpenter, 1980; Jay and Dahl, 1975; Grob, 1970; McLaughlin, 1969; Robinson and Hall, 1941).
  • There is no real upper limit on skimming speeds, but the average college student skims for main ideas at rates between 450 and 600 words per minute (Rayner et al., 2016; Carver, 1992; Just and Carpenter, 1980; Jay and Dahl, 1975).

  • In conditions where the material is more difficult (i.e., with some new words and concepts), the optimal reading rate slows to 200 words per minute (Carver, 1992).

  • In conditions where the purpose is to memorize the text for later recall, the optimal reading rate slows even further to 138 words per minute or lower (Carver, 1992).

  • Although this has not been measured (to our knowledge), reading experts have argued that it is perfectly reasonable to slow down to rates below 50 words per minute if the goal is to engage a text (Parker, 1962). 

What we don't know, but deduce and/or stipulate:

  • Given that the rates above were discovered in laboratory conditions, when subjects are asked to perform in short, time-constrained intervals, we assume that the reading rates in actual conditions, when students read for longer periods with periodic breaks, will be slightly slower.

  • Because there is no research on the time it takes students to engage texts, we assume that the rates would be similar to the rates found when students are asked to memorize a text for later recall. Although these are incredibly different tasks, both require attention to details alongside additional processing. If anything, we imagine equating these two rates significantly underestimates the time it takes to read for engagement (for an example of the sort of reading that is likely to take more time than it takes to memorize, see the appendix of Perry et al., 2015). 

  • If the reading purpose remains the same, the change in reading rates across text difficulty levels is linear.

  • The rate of change in reading rates across text difficulty levels is the same across reading purposes.

Combining what we know with what we assume allows us to construct the following table of estimated reading rates (with rates about which we are most confident in yellow):
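To make the construction concrete, here is a sketch of how such a table can be generated from the anchor rates and assumptions above. The anchor values (300 and 200 wpm for reading to understand, the low end of the skimming range for survey, the ~130 wpm memorization-like rate for engagement) come from the findings cited earlier; the linear 100-wpm step and the 25-wpm floor are our stipulations, and the published estimator's numbers may differ:

```python
# Illustrative reconstruction of a reading-rate table from the
# findings and assumptions above; the published estimator's exact
# numbers may differ.

DIFFICULTIES = ["no new concepts", "some new concepts", "many new concepts"]

# Anchor rates (words per minute) at the easiest difficulty level:
# survey uses the low end of the 450-600 wpm skimming range;
# engage borrows the ~130 wpm memorization-like rate.
ANCHORS = {"survey": 450, "understand": 300, "engage": 130}

STEP = 100   # wpm lost per difficulty level (from the 300 -> 200 drop)
FLOOR = 25   # keep rates positive; Parker (1962) defends sub-50 wpm reading

def reading_rate(purpose, difficulty):
    """Estimated words per minute for a purpose/difficulty pair."""
    level = DIFFICULTIES.index(difficulty)
    return max(ANCHORS[purpose] - STEP * level, FLOOR)

def weekly_reading_hours(pages, words_per_page, purpose, difficulty):
    """Hours per week implied by a weekly reading assignment."""
    total_words = pages * words_per_page
    return total_words / reading_rate(purpose, difficulty) / 60

# e.g., 100 monograph pages a week, read to understand, some new concepts:
print(weekly_reading_hours(100, 600, "understand", "some new concepts"))  # 5.0
```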

WRITING RATES

Sadly, we know much less about student writing rates than we do about reading rates. This is no doubt because writing rates vary even more widely than reading rates. Nevertheless, we've found at least one study that can give us a place to begin. In "Individual Differences in Undergraduate Essay-Writing Strategies," Mark Torrance and his colleagues find (among other things) that 493 students reported spending anywhere from 9 to 15 hours on 1500-word essays. In these essays, students were asked to produce a "critical description and discussion of psychological themes" using at least one outside source. Torrance and his colleagues also show that students who spent the least time reported no drafting, while those who spent the most time reported multiple drafts, along with detailed outlining and planning. And the students who spent the most time received higher marks than those who spent the least (Torrance et al., 2000).

Although the sample of this study is sizable, we should not read too much into a single set of student self-reports about a single assignment from a single institution. But to arrive at our estimates, we must. Users should simply be aware that the table below is far more speculative than our reading rate estimates, and that the time your students spend on these tasks is likely to vary from these estimates in significant ways.

As with reading rates, we assume that writing rates will be a function of a variety of factors. The three we take into account are 1) page density, 2) text genre, and 3) degree of drafting and revision.

Page Density

  • 250 words: Double-Spaced, Times New Roman, 12-Point Font, 1" Margins
  • 500 words: Single-Spaced, Times New Roman, 12-Point Font, 1" Margins

Text Genre

  • Reflection/Narrative: Essays that require very little planning or critical engagement with content  
  • Argument: Essays that require critical engagement with content and detailed planning, but no outside research
  • Research: Essays that require detailed planning, outside research, and critical engagement

Drafting and Revision

  • No Drafting: Students submit essays that were never revised
  • Minimal Drafting: Students submit essays that were revised at least once
  • Extensive Drafting: Students submit essays that were revised multiple times

What we assume to arrive at our estimates:

  • The results of the Torrance study are reasonably accurate.
     
  • The assignment in the study falls within the "argument" genre. It's hard to tell without more details, but "critical description and discussion" seems to imply more than reflection. And while an outside source was required, finding and using a single source falls well below the expectations of a traditional research paper.
     
  • Students write at a constant rate. That is, we assume that a student writing the same sort of essay will take exactly twice as much time to write a 12 page paper as she takes to write a 6 page paper. There are good reasons to think this assumption is unrealistic, but because we have no way of knowing how much rate might shift over the course of a paper, we assume constancy.
     
  • Students will spend less time writing a reflective or narrative essay than they spend constructing an argumentative essay (assuming the same degree of drafting and revision). For simplicity's sake, we assume they will spend exactly half the time. It's highly unlikely to be this linear, but we don't know enough to make a more accurate assumption.
     
  • Students will spend more time writing a research paper than they spend on their argumentative essays. Again, for simplicity's sake, we assume they will spend exactly twice the amount of time. It's not only unlikely to be this linear, it's also likely to vary greatly by the amount of outside reading a student does and the difficulty of the sources he or she tackles.

These assumptions allow us to construct the following table of estimated writing rates (with rates about which we are most confident in yellow):
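The derivation can be sketched in a few lines. Starting from the Torrance et al. figures (1500 words in 9-15 hours), we map the 9-hour end to no drafting and the 15-hour end to extensive drafting, interpolate minimal drafting between them, and apply the genre multipliers stipulated above. All of these mappings are our assumptions, and the numbers used by the published estimator may differ:

```python
# Illustrative writing-rate table built from the Torrance et al.
# figures and the assumptions above; the exact numbers used by the
# published estimator may differ.

TORRANCE_WORDS = 1500
# Hours reported for a 1500-word "argument" essay, by drafting level
# (the 12-hour midpoint for minimal drafting is interpolated).
ARGUMENT_HOURS = {"no drafting": 9, "minimal drafting": 12, "extensive drafting": 15}
# Stipulated: reflection takes half the time of argument, research twice.
GENRE_MULTIPLIER = {"reflection": 0.5, "argument": 1.0, "research": 2.0}

def words_per_hour(genre, drafting):
    """Estimated writing rate, assuming a constant rate per essay."""
    hours = ARGUMENT_HOURS[drafting] * GENRE_MULTIPLIER[genre]
    return TORRANCE_WORDS / hours

def hours_for_assignment(pages, words_per_page, genre, drafting):
    """Hours implied by a writing assignment of a given length."""
    return pages * words_per_page / words_per_hour(genre, drafting)

# e.g., a 6-page double-spaced argument essay with minimal drafting:
print(hours_for_assignment(6, 250, "argument", "minimal drafting"))  # 12.0
```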


Bibliography

Aaronson, Doris, and Steven Ferres. “Lexical Categories and Reading Tasks.” Journal of Experimental Psychology: Human Perception and Performance 9, no. 5 (1983): 675–99. doi:10.1037/0096-1523.9.5.675.

Acheson, Daniel J., Justine B. Wells, and Maryellen C. MacDonald. “New and Updated Tests of Print Exposure and Reading Abilities in College Students.” Behavior Research Methods 40, no. 1 (2008): 278–89. doi:10.3758/BRM.40.1.278.

Carrillo, Lawrence W., and William D. Sheldon. “The Flexibility of Reading Rate.” Journal of Educational Psychology 43, no. 5 (1952): 299–305. doi:10.1037/h0054161.

Carver, Ronald P. “Is Reading Rate Constant or Flexible?” Reading Research Quarterly 18, no. 2 (1983): 190–215. doi:10.2307/747517.

———. “Optimal Rate of Reading Prose.” Reading Research Quarterly 18, no. 1 (1982): 56–88. doi:10.2307/747538.

———. “Reading Rate: Theory, Research, and Practical Implications.” Journal of Reading 36, no. 2 (1992): 84–95.

Dehaene, Stanislas. Reading in the Brain: The New Science of How We Read. Reprint edition. New York: Penguin Books, 2010.

Grob, James A. “Reading Rate and Study-Time Demands on Secondary Students.” Journal of Reading 13, no. 4 (1970): 285–88.

Hausfeld, Steven. “Speeded Reading and Listening Comprehension for Easy and Difficult Materials.” Journal of Educational Psychology 73, no. 3 (1981): 312–19. doi:10.1037/0022-0663.73.3.312.

Jay, S., and Patricia R. Dahl. “Establishing Appropriate Purpose for Reading and Its Effect on Flexibility of Reading Rate.” Journal of Educational Psychology 67, no. 1 (1975): 38–43. doi:10.1037/h0078669.

Just, Marcel A., and Patricia A. Carpenter. “A Theory of Reading: From Eye Fixations to Comprehension.” Psychological Review 87, no. 4 (1980): 329–54. doi:10.1037/0033-295X.87.4.329.

Love, Jessica. “Reading Fast and Slow.” The American Scholar, March 1, 2012.

McLaughlin, G. Harry. “Reading at ‘Impossible’ Speeds.” Journal of Reading 12, no. 6 (1969): 449–510.

Parker, Don H. “Reading Rate Is Multilevel.” The Clearing House 36, no. 8 (1962): 451–55.

Perry, John, Michael Bratman, and John Martin Fischer. “Appendix: Reading Philosophy.” In Introduction to Philosophy: Classical and Contemporary Readings, 7th edition. New York: Oxford University Press, 2015.

Rayner, Keith, Elizabeth R. Schotter, Michael E. J. Masson, Mary C. Potter, and Rebecca Treiman. “So Much to Read, So Little Time: How Do We Read, and Can Speed Reading Help?” Psychological Science in the Public Interest 17, no. 1 (May 1, 2016): 4–34. doi:10.1177/1529100615623267.

Robinson, F., and P. Hall. “Studies of Higher-Level Reading Abilities.” Journal of Educational Psychology 32, no. 4 (1941): 241–52. doi:10.1037/h0062111.

Siegenthaler, Eva, Pascal Wurtz, Per Bergamin, and Rudolf Groner. “Comparing Reading Processes on E-Ink Displays and Print.” Displays 32, no. 5 (December 2011): 268–73. doi:10.1016/j.displa.2011.05.005.

Torrance, Mark, Glyn V. Thomas, and Elizabeth J. Robinson. “Individual Differences in Undergraduate Essay-Writing Strategies: A Longitudinal Study.” Higher Education 39, no. 2 (2000): 181–200.

Underwood, Geoffrey, Alison Hubbard, and Howard Wilkinson. “Eye Fixations Predict Reading Comprehension: The Relationships between Reading Skill, Reading Speed, and Visual Inspection.” Language and Speech 33, no. 1 (January 1, 1990): 69–81.

Wolf, Maryanne. Proust and the Squid: The Story and Science of the Reading Brain. Reprint edition. New York: Harper Perennial, 2008.

Zacks, Jeffrey M., and Rebecca Treiman. “Sorry, You Can’t Speed Read.” The New York Times, April 15, 2016.