© J Charles Alderson, novELTy Volume 7, Number 1. All rights reserved.
Exploding Myths: Does the number of hours per week matter?¹

J Charles Alderson

Introduction

In discussions around the new Hungarian National Core Curriculum for modern languages in general, and for English in particular, it has been my experience that the most heated debates relate not to the content of the curriculum, but to the number of hours per week that are prescribed. The National Core Curriculum claims that its objectives can be achieved under normal circumstances in 50-70% of the allocated time. For modern foreign languages, 11-15% of total timetabled time is recommended for grades 1 to 6, and between 9 and 13% for grades 7 to 10. Teachers and would-be curriculum designers alike, however, seem to believe that three hours per week is not sufficient to attain anything like a decent standard of English, and thus they call for an increase to four, or in some cases even five, hours per week. Yet even though many school principals are sympathetic to this argument and make every effort to increase the weekly contact hours, it is understandable that many cannot do so, even with the best will in the world: there are only so many hours that can be timetabled, and other curricular subjects have equally legitimate claims on that time.

Unfortunately, very little evidence has been produced to show that more hours are needed. Why should we believe what some teachers assert? Prima facie, an increase of one hour a week does not seem likely to make much difference. Why should the number of hours alone matter? Surely what matters most is the quality of the teaching that students receive, rather than its quantity. And if the teaching is bad, five hours a week of it is likely to do more harm than three.

For the first time, however, evidence is now available to inform the debate. The Examination Reform Project for English (for further details on the project see Nagy Edit’s report in this issue) has recently piloted experimental exams on over 1,000 secondary school pupils in Years 10 and 12, and has been able to analyse the results in terms of self-reported bio-data, including number of hours studied per week, and the number of years that students have been studying English. 

Design of the Study

In a three-week period in April, 1999, experimental tests of reading, listening and use of English were piloted in 27 schools throughout Hungary. Of these, 6 were in Budapest and the rest in the provinces.

In order to trial as many test items as possible, the tests were compiled into six test booklets: two booklets of Listening, and four of Reading/Use of English. All students took a test of reading and of use of English, and approximately 500 also took a test of listening. Any one test booklet, therefore, was taken by approximately 250 students (see Number of cases in Table 1). In total, across the six booklets there were 5 Listening tasks with 42 items, 13 Reading tasks with 105 items, and 7 Use of English tasks with 80 items. Since each item was taken by about 250 students, reliable statistics could be calculated on item difficulty and discrimination (i.e. how well each item distinguishes between more and less proficient students). The results of the tests, including mean (average) scores (each item being worth one point), standard deviations (s.d. - a measure of how well the test spreads the students out) and reliability indices (alpha), are given in Table 1.
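For readers who wish to see how such figures are derived, the sketch below computes the classical item statistics of the kind reported here - facility (difficulty), discrimination and Cronbach's alpha - from a 0/1-scored response matrix. The function and variable names are illustrative only; this is not the project's actual analysis code.

```python
import numpy as np

def item_statistics(responses):
    """Classical item analysis for a 0/1-scored response matrix.

    responses: 2-D array, one row per student, one column per item.
    Returns item facility (proportion correct), corrected point-biserial
    discrimination, and Cronbach's alpha for the test as a whole.
    """
    responses = np.asarray(responses, dtype=float)
    n_students, n_items = responses.shape
    totals = responses.sum(axis=1)         # each student's raw score

    facility = responses.mean(axis=0)      # item difficulty: proportion correct

    # Discrimination: correlate each item with the total score EXCLUDING
    # that item, so an item is not correlated with itself.
    discrimination = np.array([
        np.corrcoef(responses[:, i], totals - responses[:, i])[0, 1]
        for i in range(n_items)
    ])

    # Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    item_vars = responses.var(axis=0, ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_vars.sum() / totals.var(ddof=1))

    return facility, discrimination, alpha
```

With around 250 students per item, statistics of this kind are stable enough for the purposes of item selection.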

As is apparent from the reliability, standard deviation and mean discrimination figures, all tests discriminated well among the population, and all were highly reliable. They tended to be somewhat difficult, especially the second Listening booklet (mean 31%) and the fourth Reading/Use of English booklet (mean 28%). However, the booklets were not intended to be of equal difficulty: the intended difficulty of the tasks within each booklet varied, to ensure a spread of difficulties across the ability range of the population. The spread was as follows:


Anchor items are those which are common to each booklet. There were 10 anchor Listening items, 10 anchor Reading items and 19/20 anchor Use of English items. Their use makes it possible to compare each person’s score on the anchor items with their score on the items being piloted, and thus to calibrate item difficulty onto a common scale. Using this scale, it was then possible to arrive at a calibrated measure of each individual’s ability, regardless of which combination of tests that student had taken. (The anchor tests had been developed in an earlier joint project between the Hungarian National Institute of Education [OKI] and CITO, the Dutch National Testing Agency, in 1993-5.)
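By way of illustration, the sketch below shows the simplest form of such linking: mean/mean linking of item difficulties via the anchors. The project's actual calibration method is not spelled out here (the mention of person misfit below suggests a Rasch-family model), and all names and numbers in the sketch are invented.

```python
import numpy as np

def mean_mean_link(anchors_on_scale, anchors_in_booklet, pilot_in_booklet):
    """Mean/mean linking: shift a booklet's item difficulties onto the
    common scale, using the anchor items that appear in both calibrations."""
    shift = np.mean(anchors_on_scale) - np.mean(anchors_in_booklet)
    return np.asarray(pilot_in_booklet) + shift

# Invented example: three anchors and two pilot items from one booklet.
anchors_on_scale   = np.array([-0.5, 0.0, 0.8])  # anchors on the common scale
anchors_in_booklet = np.array([-0.2, 0.3, 1.1])  # same anchors, calibrated within the booklet
pilot_in_booklet   = np.array([0.4, 1.6])        # new items calibrated within the booklet

print(mean_mean_link(anchors_on_scale, anchors_in_booklet, pilot_in_booklet))
# -> [0.1  1.3] : the pilot items, now expressed on the common scale
```

Once all items sit on one scale, a calibrated ability estimate for each student follows from whichever items that student happened to take.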

The pilot sample was made up of pupils in both Years 10 and 12 (Years 2 and 4 of upper secondary). Just over 1,000 pupils in total took the tests, but a number had to be dropped from the analyses, either because they could not be calibrated (person misfit) or because they did not complete one or other of the tests. We were left with a sample of 944 for the analysis.
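The fit statistic used to detect misfitting persons is not specified; a common choice in Rasch analysis is the outfit mean-square, sketched here with illustrative inputs.

```python
import numpy as np

def outfit_msq(x, theta, b):
    """Rasch outfit mean-square for one person.

    x:     the person's 0/1 responses
    theta: the person's estimated ability
    b:     the difficulties of the items that person took
    Values near 1 indicate good fit; values well above 1 (say, over 2)
    flag erratic response patterns, i.e. person misfit.
    """
    b = np.asarray(b, dtype=float)
    p = 1.0 / (1.0 + np.exp(-(theta - b)))          # model-expected P(correct)
    z2 = (np.asarray(x) - p) ** 2 / (p * (1 - p))   # squared standardized residuals
    return z2.mean()
```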

Pupils were asked, in a questionnaire in Hungarian, to give some personal details: age, sex, years spent learning English, hours of English per week, whether they were taking private lessons (and if so, for how many years and how many hours per week), other languages being studied at school (again, for how many years and hours per week), whether they had passed the State Foreign Language Exam and if so at what level, whether they had passed any international language exams, whether they intended to take the “érettségi” (school-leaving exam) in English, and whether they planned to take the University Entrance Examination in English.

The age, sex and year distribution was as follows:

In terms of school types, the sample was as follows:


A total of 107 pupils (11%) were taking private lessons, of whom 40% had had one year of lessons so far, and 23% two years. The commonest frequency was one (40%) or two (43%) hours per week. A further 676 pupils (72%) were taking another foreign language, German being the most frequent (47% of the whole population), French second at 10%, with Italian, Latin, Spanish and Russian next, in that order. Only 12 pupils had passed the Advanced State Foreign Language exam, but 150 (16%) had passed the Intermediate exam, of whom 51 had passed type A, and 92 type C. Only 12 had passed the Basic State Foreign Language exam. 32 pupils (3%) had passed an international exam, the largest group (9) having taken the Cambridge First Certificate in English. In all, 682 pupils (76% of those responding) intended to take the English “érettségi”, and only 145 (19%) intended to take the University Entrance Exam in English.

In the absence of national statistics, it is difficult to say whether this sample is representative of the whole English-learning population. Although no attempt was made to create a strictly representative sample, every attempt was made to ensure a spread across the variables thought likely to be most important, and the sheer size of the sample is also likely to have contributed to its robustness.

Whether this is a fully representative sample or not, we clearly have enough data to begin to explore the language proficiency of this population, in relation to various characteristics reported on the questionnaire. This was done in two ways: by comparing the mean scores and the spread of scores (variance) of two groups, using an independent-samples t-test; and by contrasting the means and variances of several groups, using analysis of variance with post-hoc multiple comparisons.
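For anyone wishing to replicate this style of analysis, the sketch below runs the two procedures in Python. The file and column names are assumptions, and Tukey's HSD stands in for whichever post-hoc test the project actually used.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical data: one calibrated score per pupil, plus grouping variables.
df = pd.read_csv("pilot_scores.csv")  # assumed columns: score, sex, hours_per_week

# Two groups: independent-samples t-test (e.g. boys vs girls).
boys = df.loc[df["sex"] == "male", "score"]
girls = df.loc[df["sex"] == "female", "score"]
t, p = stats.ttest_ind(boys, girls)
print(f"t = {t:.2f}, p = {p:.3f}")

# Several groups: one-way analysis of variance (e.g. hours per week)...
groups = [g["score"].values for _, g in df.groupby("hours_per_week")]
F, p = stats.f_oneway(*groups)
print(f"F = {F:.2f}, p = {p:.3f}")

# ...followed by post-hoc pairwise comparisons (Tukey's HSD here).
print(pairwise_tukeyhsd(df["score"], df["hours_per_week"]))
```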

Results

In what follows, it is important to note that the scores reported are not raw scores, nor are they percentage scores: they are the calibrated scores, calculated as explained in the previous section. The results are reported as answers to questions about the English proficiency of sections of the pilot sample.
 

1. What difference does it make whether a student is in Year 10 or Year 12?

2. What difference does it make whether the student is male or female?

There is no statistically significant difference between the mean scores of 9.57 and 9.46.  Despite beliefs to the contrary, there is no evidence in this sample that girls are better at English than boys.

3. What difference does it make which type of school a student studies in?

The statistical analyses show clearly that these are three distinct groups: grammar school students do better than combined grammar/vocational school students, who do better than vocational school students. This accords with intuitions and the known difference between school types.

4. What difference does it make how many years a student studies English?

In order to get a meaningful picture, it is important to examine the results for students in Year 10 separately from those now in Year 12. If we take Year 12 students only, we see the following mean scores.

The ONLY statistically significant difference among these average scores is between 4 years of English and 10 years. If we take the Year 10 students only, we see the following mean scores:

Again, the ONLY statistically significant contrasts are between 2 years of English and 8, 9 or 10 years. None of the other contrasts was significant.

5. What difference does it make how many hours per week a student reportedly studies English?

If we take Year 10 only, we see the following mean scores:

As can be seen in Table 10, the number of hours per week reported varies considerably, from 2 to 9. The majority of students report either 3 (n=43, 15%), 4 (n=70, 24%) or 5 (n=124, 43%) lessons per week. Therefore, we took 3, 4 and 5 hours per week only and contrasted the average scores for each of the pairs. Table 11 summarizes the results of this analysis.

Statistical tests show that, for Year 10 students, having 3 hours per week results in significantly lower scores than 4 or 5 hours per week, but that there is no significant difference between 4 and 5 hours per week.

If we look at Year 12 only, we find the following numbers of hours per week:

It seems sensible to ignore the small numbers and again to concentrate on 3, 4 and 5 hours per week. When we do this, we find the following mean scores:

Statistical tests show that there are significant differences between 3, 4 and 5 hours of learning English per week: 3 is significantly different from 4 and 5, and 4 is significantly different from 5. Teachers’ beliefs appear to be vindicated.

However, if we look at the results more carefully, and examine the mean scores, we see that the highest scores are achieved by students studying for ONLY 3 hours per week, and those studying 4 hours per week score lowest!

In fact, I would argue that although these mean scores are significantly different statistically, they are NOT meaningfully different (otherwise you would have to argue that to guarantee the highest proficiency in English, students should only learn English for three hours a week!). A more sensible interpretation of these data is that it MAKES NO MEANINGFUL DIFFERENCE whether students study for 3, 4 or 5 hours per week, at least in Year 12 - the bulk of our sample. Of course, people will argue that this is because the best students have taken the State Foreign Language exam and are no longer in our sample. Let us then examine this claim.
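Before doing so, it is worth making the distinction between a statistically significant and a meaningful difference concrete. One way is to report an effect size alongside the p-value, since with groups of this size very small differences in means can reach significance. Below is a minimal sketch of Cohen's d, an effect size not reported in the analysis above.

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d: the difference between two group means, expressed in
    pooled-standard-deviation units. Conventionally: around 0.2 is
    'small', 0.5 'medium', 0.8 'large'."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    pooled_var = (((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                  / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)
```

With groups of this size, a d well below the conventional 'small' threshold of 0.2 can still yield p < .05, which is exactly the situation in which significance testing alone misleads.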

6. What difference does it make if students have or have not passed the Intermediate level State Foreign Language exam?

In Year 10, only 5 pupils have passed the Intermediate Level State Foreign Language exam; the majority of those with the Intermediate exam are in Year 12: 45 have passed type A and 88 type C.

If we concentrate on Year 12 and compare the scores of those pupils who have passed the Intermediate State Foreign Language exam with those who have not, we get the following results:

There is a very clear difference in scores between those with and those without the State Foreign Language exam, as expected. 

We can also contrast, again in Year 12 only, the scores by number of hours per week for those pupils who HAVE passed the Intermediate State Foreign Language exam (Table 15). The analyses show a significant difference between the mean scores for those taking 3 hours a week and those taking 5 hours per week, but no significant difference between 3 and 4 hours a week, nor between 4 and 5 hours a week.

If we remove from the Year 12 group those pupils who HAVE passed the Intermediate State Foreign Language exam, and contrast the scores of those who have NOT passed it, according to the number of hours per week (Table 16), we find that the only significant differences are between 3 and 4 hours, and between 3 and 5 hours; 4 is not significantly different from 5. However, yet again, we find that pupils taking 3 hours per week have a HIGHER mean score than those taking 4 or 5 hours per week!

Thus the lack of effect of hours per week cannot be due to the absence of large numbers of students who have passed the State Foreign Language exam. Of course, it can be argued that students also take private lessons in addition to school classes, and that this will distort the effect of class hours. We examine this claim next.

7. Effect of taking private lessons on proficiency

If we contrast those students taking private lessons with those who are not, we find no significant difference between those 538 Year 12 pupils with no private lessons and those 33 pupils with one year’s worth of private lessons already (9.59 compared with 9.78). Although the 16 pupils reporting two years of private lessons had a higher mean score (10.01), statistically this was not significantly different from not having private lessons.

If we then add the number of hours taken in private lessons to the number of hours taken in school, to get a total number of hours of English, we find the following data for Year 12.

If we then contrast, as before, the three most frequent groups - 3, 4 and 5 hours per week - we get the following results, again for Year 12.

Statistical tests show that the only significant difference in hours per week is between 3 and 4 hours (where 3 hours per week still has the highest mean score). A similar result applies if we remove from the Year 12 sample those who have passed the State Foreign Language exam and contrast their proficiency according to the total number of hours of English they report, in school and privately.

Summary and conclusion

The main findings of this investigation are as follows.
 

  1. There are significant differences in English proficiency (EP) between Year 10 and Year 12 pupils, and among the three school types, all in the expected direction.
  2. There are no significant differences in EP between boys and girls.
  3. The only significant difference in EP for years studied by Year 12 pupils is between 4 years and 10 years.
  4. The only significant difference in EP for years studied by Year 10 pupils is between 2 years and 8/9/10 years.
  5. The only significant difference in EP for hours per week for Year 10 pupils is between 3 hours and 4/5 hours.
  6. The only significant difference in EP for hours per week for Year 12 pupils is between 3 hours and 4/5 hours. BUT pupils with three hours per week score higher than the other two groups.
  7. The same is true for those pupils who have passed the State Foreign Language exam, looked at separately, and for those students who have not passed it, again looked at separately.
  8. There are significant differences in EP between those pupils who have passed the Intermediate State Foreign Language exam and those who have not.
  9. There is no difference in EP among pupils who have no private lessons, those who have one year of private lessons and those who have two years of private lessons.
  10. The only difference in EP for the total number of hours studying English (including private lessons) is between those studying for 3 hours per week and those studying for 4 hours per week. Those studying only 3 hours per week score higher!
These results suggest the need to re-examine the belief that the number of hours per week a pupil studies English, or even the number of years, makes a significant difference. What is likely to be far more important than either variable is the quality of the English teaching to which pupils are exposed. Thus, debates about whether there should be 3, 4 or 5 hours per week devoted to English are futile, and should be abandoned. Instead, attention should be turned to measuring the quality of the teaching, and to maximising its impact.

Notes

  1. This paper was originally delivered at the 9th IATEFL-Hungary Conference in Győr, October 1999.

Charles Alderson is currently contracted by the British Council to advise on the English Examination Reform Project and to act as Director of Studies of the PhD in Language Pedagogy at ELTE Budapest. He is Professor of Linguistics and English Language Education at Lancaster University, a post to which he returns from leave of absence in September 2000.