Listening: a vital skill
One of the first things new teachers learn is that exposing students to plenty of listening can quickly increase their ability to communicate orally. Even beginners armed with a very limited repertoire of vocabulary and grammar can often get their ideas across, provided they generally understand what other people are saying to them. In fact, Feyten (1991) is often cited as estimating that listening makes up a full 45% of our total communication time in a language. So the assessment of listening will invariably be of paramount importance in the overall evaluation of our learners’ communicative ability. And yet it remains one of the most elusive skills to test reliably.
What are we testing anyway?
Let us first consider what we set out to test – a concept known as the test “construct.” At one end of the spectrum we might consider bottom-up processing of auditory input, such as the ability to distinguish between particular phonemes or to recognize particular words and word boundaries. Certainly we could all agree that these skills play their part in competent listening. But today’s teachers, trained in a more communicative methodology, would probably be more at home working with top-down skills, which help students derive meaning by making inferences about things like context, implied social relationships, situational cues and gist. Perhaps we need to focus more on this end of the spectrum to reflect the communicative teaching we use in our classes.
How communicative can a test be?
The problem, however, is that no matter how communicative our teaching methods might be, applying this principle to test design is tricky. What would a communicative test look like? A key feature often pointed to is the authenticity of the listening input. But to ensure a test is reliable and valid, we often need to control or adapt this input for the test-taker. We might also consider making the tasks the test-taker needs to complete as close to a real-world skill as possible. But few real-world tasks lend themselves to easy or consistent marking.
Gary Buck (2001) has reached the conclusion that a truly communicative listening test is probably more of a holy grail than something achievable in practice. Instead, he says:
“In reality we should not be looking at the test, but at the interaction between the test tasks and the test-taker. If a test is well designed, it will allow us to make inferences about the test-taker’s communicative language ability, and that is what we are interested in.”
What we need to do, then, is strike a balance: design a test which is practical while at the same time giving us an idea of how the student would fare in an authentic situation. A gap-fill, for instance, may not be the most realistic task imaginable, but we can provide the test-taker with relatively authentic listening input, and the gaps can be designed to replicate note-taking, which mirrors a real-world skill.
Another issue arises when we consider that skills in the real world are rarely, if ever, used in isolation; they are integrated in pursuit of communicative goals. Listening, for example, is often tested, albeit indirectly, when testing speaking, as these skills obviously go hand in hand. Often this is captured in criteria such as “interactive communication” or “sociolinguistic appropriateness.” And by their very design, tests will almost invariably ask students to use the skills of reading and writing as they decode instructions and test items on the page and offer a written response. Dictation in particular is one type of listening assessment task where the integration of writing and listening is patently clear.
Dictation making a comeback
Prior to the inclusion of recorded material in teaching resources, dictation was virtually the only way to provide students with listening input. While many teachers may think of dictation as old-fashioned, a recent survey conducted in Spain shows that more teachers use dictation than do not, and in fact a full 30% rely on it as a regular part of their teaching. Variations on it, such as dictogloss (Wajnryb, 1990), are also commonly used for the teaching of grammar. But let’s take a fresh look at dictation in the context of language testing.
Dictation normally involves listening to a short text once straight through, and then again, this time divided into smaller chunks with pauses. Breaking the text up decreases the cognitive load on the test-taker (effectively, how much they need to hold in short-term memory). The test-taker then writes down exactly what they have heard during the pauses, and finally they are given a chance to check what they have written for mistakes.
You may notice that dictation covers both bottom-up processing (decoding speech, word recognition, etc.) and top-down skills, as students need to make sure the meaning is faithful to the message they have heard. In this way the range of skills being tested is quite wide compared to other types of listening exam task. The integration of writing and listening also mirrors a real-world need to encode spoken messages as written ones. It is worth noting that good practice is to report dictation in both the listening and the writing marks on the exam, as is done with the dictation task in PTE General, for example.
Getting the balance right
No listening test is perfect. Trade-offs are invariably made which sacrifice some degree of authenticity or communicativeness in favor of practicality, but the true measure of a sound listening test is the degree to which it allows us to make reasonably valid and reliable observations about the test-taker’s ability. And whether we are preparing our students for an external exam or creating a test of our own, it is important for us as teachers to be aware of how these choices are made.
Feyten, C. M. (1991). The power of listening ability: an overlooked dimension in language acquisition. The Modern Language Journal, 75(2), 173–180.
Buck, G. (2001). Assessing Listening. Cambridge: Cambridge University Press, p. 92.
Wajnryb, R. (1990). Grammar Dictation. Oxford: Oxford University Press.